
boto3's Introduction

Boto3 - The AWS SDK for Python

Package Version Python Versions License

Boto3 is the Amazon Web Services (AWS) Software Development Kit (SDK) for Python, which allows Python developers to write software that makes use of services like Amazon S3 and Amazon EC2. You can find the latest documentation, including a list of supported services, at our doc site.

Boto3 is maintained and published by Amazon Web Services.

Boto (pronounced boh-toh) was named after the freshwater dolphin native to the Amazon river. The name was chosen by the author of the original Boto library, Mitch Garnaat, as a reference to the company.

Notices

On 2023-12-13, support for Python 3.7 ended for Boto3. This follows the Python Software Foundation end of support for the runtime which occurred on 2023-06-27. For more information, see this blog post.

Getting Started

Assuming that you have a supported version of Python installed, you can first set up your environment with:

$ python -m venv .venv
...
$ . .venv/bin/activate

Then, you can install boto3 from PyPI with:

$ python -m pip install boto3

or install from source with:

$ git clone https://github.com/boto/boto3.git
$ cd boto3
$ python -m pip install -r requirements.txt
$ python -m pip install -e .

Using Boto3

After installing boto3, set up credentials (in e.g. ~/.aws/credentials):

[default]
aws_access_key_id = YOUR_KEY
aws_secret_access_key = YOUR_SECRET

Then, set up a default region (in e.g. ~/.aws/config):

[default]
region=us-east-1

Other credential configuration methods can be found here.

Then, from a Python interpreter:

>>> import boto3
>>> s3 = boto3.resource('s3')
>>> for bucket in s3.buckets.all():
...     print(bucket.name)

Running Tests

You can run tests in all supported Python versions using tox. By default, it will run all of the unit and functional tests, but you can also specify your own pytest options. Note that this requires that you have all supported versions of Python installed, otherwise you must pass -e or run the pytest command directly:

$ tox
$ tox -- unit/test_session.py
$ tox -e py26,py33 -- integration/

You can also run individual tests with your default Python version:

$ pytest tests/unit

Getting Help

We use GitHub issues for tracking bugs and feature requests and have limited bandwidth to address them. Please use these community resources for getting help:

Contributing

We value feedback and contributions from our community. Whether it's a bug report, new feature, correction, or additional documentation, we welcome your issues and pull requests. Please read through this CONTRIBUTING document before submitting any issues or pull requests to ensure we have all the necessary information to effectively respond to your contribution.

Maintenance and Support for SDK Major Versions

Boto3 was made generally available on 06/22/2015 and is currently in the full support phase of the availability life cycle.

For information about maintenance and support for SDK major versions and their underlying dependencies, see the following in the AWS SDKs and Tools Shared Configuration and Credentials Reference Guide:

More Resources


boto3's Issues

Unable to create client by passing service_name as unicode

It is impossible to create a client if you pass a service_name as unicode in Python 2. You can check it yourself by running this:

import boto3
rds = boto3.client(u'rds')

Error also shows up with from __future__ import unicode_literals:

from __future__ import unicode_literals
import boto3
rds = boto3.client('rds')

The code above results in the following traceback:

Traceback (most recent call last):
  File "secret_script.py", line 114, in <module>
    main()
  File "secret_script.py", line 51, in main
    rds = boto3.client(u'rds', region_name=REGION_NAME)
  File "/home/orgkhnargh/.virtualenvs/7494e79086e46ba7/local/lib/python2.7/site-packages/boto3/__init__.py", line 79, in client
    return _get_default_session().client(*args, **kwargs)
  File "/home/orgkhnargh/.virtualenvs/7494e79086e46ba7/local/lib/python2.7/site-packages/boto3/session.py", line 217, in client
    aws_session_token=aws_session_token)
  File "/home/orgkhnargh/.virtualenvs/7494e79086e46ba7/local/lib/python2.7/site-packages/botocore/session.py", line 836, in create_client
    client_config=config)
  File "/home/orgkhnargh/.virtualenvs/7494e79086e46ba7/local/lib/python2.7/site-packages/botocore/client.py", line 50, in create_client
    cls = self.create_client_class(service_name)
  File "/home/orgkhnargh/.virtualenvs/7494e79086e46ba7/local/lib/python2.7/site-packages/botocore/client.py", line 64, in create_client_class
    cls = type(service_name, (BaseClient,), methods)
TypeError: type() argument 1 must be string, not unicode

Boto3 version: 0.0.9
Python version: 2.7.6

This may be a botocore issue. Should I dig into it, add a test for it, fix it, etc.? Or would it be better to leave it for the botocore developers?
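For anyone hitting this in the meantime, one workaround (a sketch on my part, not an official fix) is to coerce the service name to the native str type before creating the client:

```python
# Workaround sketch: under Python 2, str() converts a unicode service
# name back to the byte-string type that type() accepts; under Python 3
# it is a no-op.
service_name = str(u'rds')

# The client call itself is unchanged (requires AWS credentials):
# import boto3
# rds = boto3.client(service_name)
```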

Missing API call for cognito-identity SetIdentityPoolRoles

Recently I noticed that when I go to view my Cognito pools in the console, I'm prompted to set the roles. Previously they were set (or so I thought) via the IAM call used to set them up, but apparently I was mistaken, or things have changed.

I noticed there's a new API: http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_SetIdentityPoolRoles.html

This API does not appear to be supported by boto3 yet. Can you please add it?
Regards,
jan

S3 multipart upload: attributes swapped?

Hello! I'm not sure whether it's me who's making a mistake here or not (boto3 0.0.7):

>>> import boto3
>>> s3 = boto3.resource('s3')
>>> bucket = s3.Bucket('ubitricity-backup')
>>> o = bucket.Object('test')
>>> mpu = o.initiate_multipart_upload()
>>> mpu
s3.MultipartUpload(bucket_name='ubitricity-backup', object_key='test', id='WYDgB2hJrtoTHxWLVL.yz98bTrL.UCuQAk.iIpNeIm6pIl.CZoZ2Uk7mJmKDokQejymCurdjKBj4IQpJF2VAJmEjvjHfepJyWSGU1P7QVdiaDIvJSX_OK5w0omLPVOcu')
>>> part = mpu.MultipartUploadPart(1)
>>> part
s3.MultipartUploadPart(bucket_name='ubitricity-backup', object_key='WYDgB2hJrtoTHxWLVL.yz98bTrL.UCuQAk.iIpNeIm6pIl.CZoZ2Uk7mJmKDokQejymCurdjKBj4IQpJF2VAJmEjvjHfepJyWSGU1P7QVdiaDIvJSX_OK5w0omLPVOcu', multipart_upload_id='test', part_number=1)
>>> part.upload(Body = 'test')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 379, in do_action
    response = action(self, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 77, in __call__
    response = getattr(parent.meta.client, operation_name)(**params)
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 299, in _api_call
    raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (NoSuchUpload) when calling the UploadPart operation: The specified upload does not exist. The upload ID may be invalid, or the upload may have been aborted or completed.

In the MultipartUploadPart object, the values for object_key and multipart_upload_id appear to be swapped.

Am I using this function the wrong way?

[Feedback] API is not really pythonic

While I do understand the desire to provide methods very similar (if not analogous) to the AWS CLI, I really feel that working with boto3 does not improve much on the AWS CLI. From my point of view, boto3 should not only be a wrapper but also an adapter that makes the CLI something desirable to work with from Python.

For instance, instead of creating an S3 client with boto3.client('s3'), hide this behind an S3Client class. Instead of managing a multi-part upload with the upload ID directly, it would be more 'pythonic' to have a MultiPartUpload class and have the corresponding methods take or return an instance of it. What is the point of having the client do operations such as get_bucket_policy(Bucket='string')? Wouldn't it be much better (and more intuitive) to have a getter for bucket_policy on the Bucket class? Helper methods would also help a lot, something like Bucket.upload_file(file, path), or along those lines.

The point isn't precisely the examples I have given here; the point is that the library is hard to work with because it is essentially the same as the CLI. Please note I do not mean any disrespect; this is, at least from my point of view, constructive criticism.

Need some help with basic stuff

Hi all,

I'm very new to boto and AWS in general (semi-new to Python but loving it!), and I'm trying to use boto3 (because I need Python 2.7 support for S3/Glacier) and am stuck. I read in another thread that it was OK to ask questions here, so I hope that's still true.

I am trying to make a simple online 'data dump' for archival purposes. As such, I need to put tar files, list tar files, and retrieve tar files as the core functionality.

I have been able to upload a tar, but cannot seem to get the list function to work. I haven't gotten to the 'get' function yet as a result. Here's what I have:


import boto3
import os

os.environ["AWS_ACCESS_KEY_ID"] = 'xxxxxxxxxxxxxxx'
os.environ["AWS_SECRET_ACCESS_KEY"] = 'yyyyyyyyyyyyy'

s3 = boto3.resource('s3')
s3c = boto3.client('s3')

obj = s3.Object('archive', '66784D57-3E8B-4A4F-9654-22262445F9FA.tar')
obj.put(Body=open(r'C:\Users\TropicMike\PycharmProjects\glacier\66784D57-3E8B-4A4F-9654-22262445F9FA.tar', 'rb'))

response = s3c.list_objects(Bucket='archive')
for i in response:
    print i

When I do the list_objects, I just get this output:


MaxKeys
Prefix
Name
ResponseMetadata
Marker
IsTruncated
Contents

Can someone smack me with the clue-by-four and help me see what I'm doing wrong? I just want to get the list of tars/other files stored in the bucket, along with maybe their size/date info.

Thanks!
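The loop above iterates over the response dictionary's keys, which is exactly the list printed; the object listing itself lives under the Contents key. A sketch against an illustrative response (the real dict comes from s3c.list_objects(Bucket='archive'); the values here are made up):

```python
# Illustrative list_objects response shape; the key names match the
# real API, but the values are invented for the example.
response = {
    'Name': 'archive',
    'IsTruncated': False,
    'Contents': [
        {'Key': '66784D57-3E8B-4A4F-9654-22262445F9FA.tar',
         'Size': 10485760,
         'LastModified': '2015-01-01T00:00:00.000Z'},
    ],
}

# Each entry under 'Contents' carries the key, size, and date info.
for obj in response.get('Contents', []):
    print(obj['Key'], obj['Size'], obj['LastModified'])
```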

completing multipart upload

I'm having trouble with completing a multipart upload

given the following test code

mp = s.create_multipart_upload(Bucket='datalake.primary', Key='test1')
uid = mp['UploadId']
p1 = s.upload_part(Bucket='datalake.primary', Key='test1', PartNumber=1, UploadId=uid, Body='part_0')
s.complete_multipart_upload(Bucket='datalake.primary', Key='test1', UploadId=uid, MultipartUpload=???)

I don't know what I'm supposed to set MultipartUpload to and can't work it out from the docs. I see it needs to be a dict, but I'm not sure what it should contain.

Without it, I get the error ClientError: An error occurred (InvalidRequest) when calling the CompleteMultipartUpload operation: You must specify at least one part
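For reference, CompleteMultipartUpload expects a dict with a Parts list; each part pairs the ETag returned by the corresponding upload_part call with its PartNumber. A sketch (the p1 response value below is a stand-in):

```python
# upload_part returns a dict containing the part's ETag; stand-in value:
p1 = {'ETag': '"9b2cf535f27731c974343645a3985328"'}

# One entry per uploaded part, keyed by its part number.
multipart_upload = {
    'Parts': [
        {'ETag': p1['ETag'], 'PartNumber': 1},
    ],
}

# s.complete_multipart_upload(Bucket='datalake.primary', Key='test1',
#                             UploadId=uid, MultipartUpload=multipart_upload)
```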

Using non-default profile

Is there a simple way to use a non-default profile by specifying its name?
Like on boto there is

boto.connect_ec2(profile_name="dev-profile")

I see that I can construct a session with credentials and region, but I don't see a way to just refer to the profile by name.
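Later boto3 releases support this through the Session object, which accepts a profile_name; a sketch (the profile and region names are illustrative):

```python
# Clients and resources created from a profile-pinned session use that
# profile's credentials. boto3 is imported inside the helper so this
# sketch stays self-contained.
def make_clients(profile_name, region_name='us-east-1'):
    import boto3
    session = boto3.session.Session(profile_name=profile_name,
                                    region_name=region_name)
    return session.client('ec2'), session.resource('s3')

# ec2, s3 = make_clients('dev-profile')
```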

KMS generate_data_key results in _api_call() error

When I try to call generate_data_key on a KMS client with the following code, I get an error saying I've provided too many parameters to _api_call():

import boto3
kms = boto3.client('kms')
datakey = kms.generate_data_key('my_data_key_alias')

The same error occurs when trying other calls, such as:

kms.get_key_policy('my_data_key_alias', 'default')
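The _api_call() complaint is how botocore-generated client methods reject positional arguments; they accept keyword arguments only. A sketch of the keyword form (the alias and KeySpec value below are illustrative):

```python
# Client methods generated by botocore take keyword arguments only;
# positional arguments raise the "_api_call()" TypeError shown above.
params = {'KeyId': 'alias/my_data_key_alias', 'KeySpec': 'AES_256'}

# import boto3
# kms = boto3.client('kms')
# datakey = kms.generate_data_key(**params)
# policy = kms.get_key_policy(KeyId='alias/my_data_key_alias',
#                             PolicyName='default')
```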

iam-2010-05-08.resources.json doesn't seem to have support for Managed Policies

I could be incorrect, but I believe iam-2010-05-08.resources.json needs to be regenerated to have the new methods for interacting with Managed Policies.

boto/boto#2956

Are there instructions for generating these JSON files?

https://github.com/boto/boto3/blob/master/boto3/data/resources/iam-2010-05-08.resources.json

The version from AWS has remained the same: (API Version 2010-05-08)

http://docs.aws.amazon.com/IAM/latest/APIReference/API_Operations.html

Letter "u" between Key and Value pairs in ResponseMetadata

Hi, in the response data that I am receiving, the letter "u" appears before keys and values in my dictionary (the returned data). Is this just a stray escape character, or is it something else?

My code:

import boto3

ec2 = boto3.resource('ec2')
instances = []


for status in ec2.meta.client.describe_instance_status()['InstanceStatuses']:
    instances.append(status['InstanceId'])

def filterInstances(instances):
    filtertemplate = [{'Name': 'resource-id','Values': instances}]
    return filtertemplate

for instance in instances:
    tags = ec2.meta.client.describe_tags(Filters=filterInstances(instances))

print(tags)

Here's my response data:

{'ResponseMetadata': {'HTTPStatusCode': 200, 'RequestId': 'c41f8fca-0ec8-473b-86a3-37dc9b189763'}, u'Tags': [{u'ResourceType': 'instance', u'ResourceId': 'i-xxxxxxxx', u'Value': 'xx', u'Key': 'Category'}, {u'ResourceType': 'instance', u'ResourceId': 'i-xxxxxxxx', u'Value': 'QA', u'Key': 'xxxxxxx'}]}
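The u prefix is not in the data at all; it is Python 2's repr notation for unicode strings, shown whenever a dict is printed. The values compare equal to plain strings:

```python
# The u prefix only marks unicode string literals in Python 2's repr;
# the underlying text is unchanged, so lookups and comparisons work.
tag = {u'Key': u'Category', u'Value': u'xx'}

assert tag['Key'] == 'Category'    # plain-str lookup works
assert u'Category' == 'Category'   # unicode and str compare equal
print(tag['Value'])                # prints: xx
```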

kms#decrypt throws 'utf8' codec error

kms#decrypt throws this error every time:

UnicodeDecodeError: 'utf8' codec can't decode byte 0xf2 in position 2: invalid continuation byte

Repro steps:

import boto3
kms = boto3.client('kms')
result = kms.encrypt(KeyId='some_key_id_redacted',Plaintext='my name is bob')
kms.decrypt(CiphertextBlob=result['CiphertextBlob'])

May be related to this issue

importexport.Client missing GenerateShippingLabel API Call

This is related to boto/botocore#431 which contains more background information.

According to http://docs.aws.amazon.com/AWSImportExport/latest/DG/WebCommands.html and http://docs.aws.amazon.com/AWSImportExport/latest/DG/WebGenerateShippingLabel.html the GenerateShippingLabel API call is missing. It is also missing in botocore as mentioned above. It is also missing from AWS CLI v1.6.10.

importexport.generate_shipping_label is missing:

    In [76]: importexport = boto3.client('importexport')

    In [77]: importexport.
    importexport.can_paginate   importexport.get_status
    importexport.cancel_job     importexport.get_waiter
    importexport.clone_client   importexport.list_jobs
    importexport.create_job     importexport.update_job
    importexport.get_paginator  importexport.waiter_names

Price history not correct?

I used client.describe_spot_price_history() with certain parameters, and the response I got was different from what I can see in my personal console. Is there anything I need to take care of when using this API to get the correct response?

Decryption fails from a different instance

Hi,

I am trying to encrypt on one instance and decrypt on another, however, it fails to decrypt on the second instance. I am using the following two functions:

    def encryptFile(self):
            with open(self.file2enc,"rb") as input_file:
                    encoded_file = base64.b64encode(input_file.read())

            #check if we are not trying to encrypt larger than 4kb
            if sys.getsizeof(encoded_file) > 4096:
                    print "File %s too large to encrypt" % (file2enc)
                    sys.exit(1)
            #encryption
            self.kms = boto3.client("kms")
            e = self.kms.encrypt(KeyId=self.keyid,Plaintext=encoded_file)
            self.objectBody = base64.b64encode(e['CiphertextBlob']).decode("utf-8")


    def decryptFile(self,enc_string):
            enc_string=self.encrypted_content
            self.kms = boto3.client("kms")

            print enc_string
            #decrypt
            decrypted_file = self.kms.decrypt(CiphertextBlob=enc_string)
            return decrypted_file['Plaintext']

If I run those on instance 1 it works correctly, however, if I encrypt on instance 1 and decrypt on instance 2 I get:

kms=boto3.client("kms")
kms.decrypt(CiphertextBlob='CiDLU0Ooom2Wauq+kF0F9cMkyN/tO50IzD4aTn0RUeynBhKJAQEBAgB4y1NDqKJtlmrqvpBdBfXDJMjf7TudCMw+Gk59EVHspwYAAABgMF4GCSqGSIb3DQEHBqBRME8CAQAwSgYJKoZIhvcNAQcBMB4GCWCGSAFlAwQBLjARBAzVBeU8nmuhfVzUoUUCARCAHRma/bCWarfMFZJNqkfmcN8AVTacxPCiu44lNi80')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 249, in _api_call
    raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InvalidCiphertextException) when calling the Decrypt operation: None

Both instances use the same IAM role. Is this a bug, or am I missing the obvious?

Thanks!
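One thing worth checking here: Decrypt expects the raw ciphertext bytes, while the value pasted above looks like the base64-encoded form that encryptFile() produces. Decoding before the call would look like this (the ciphertext bytes below are a stand-in):

```python
import base64

# encryptFile() base64-encodes the CiphertextBlob for transport; the
# decrypting side must decode it back to raw bytes first.
encrypted_content = base64.b64encode(b'stand-in-ciphertext-from-kms')

ciphertext_blob = base64.b64decode(encrypted_content)
assert ciphertext_blob == b'stand-in-ciphertext-from-kms'

# kms = boto3.client('kms')
# plaintext = kms.decrypt(CiphertextBlob=ciphertext_blob)['Plaintext']
```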

CloudFormation waiters

Are there any plans on adding waiters to the cloudformation client?

It would be useful to check when CRUD operations on the stack have finished, e.g. when the stack status is UPDATE_COMPLETE after updating the stack.

 boto3.client('cloudformation').waiter_names
 []
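Until waiters exist for CloudFormation, a hand-rolled poll loop covers the CRUD cases. A sketch; the lambda at the bottom stands in for a describe_stacks status lookup:

```python
import time

def wait_for_status(fetch_status, target, delay=5, max_attempts=60):
    """Poll fetch_status() until it returns target or attempts run out."""
    for _ in range(max_attempts):
        status = fetch_status()
        if status == target:
            return status
        if status.endswith('_FAILED') or status.endswith('ROLLBACK_COMPLETE'):
            raise RuntimeError('stack entered terminal state: ' + status)
        time.sleep(delay)
    raise RuntimeError('timed out waiting for ' + target)

# Stand-in for:
#   lambda: cfn.describe_stacks(StackName=name)['Stacks'][0]['StackStatus']
statuses = iter(['UPDATE_IN_PROGRESS', 'UPDATE_COMPLETE'])
print(wait_for_status(lambda: next(statuses), 'UPDATE_COMPLETE', delay=0))
# prints: UPDATE_COMPLETE
```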

Auto Scaling - describe_tags() fails to paginate

I'm using boto3 version 0.0.6 to try to list all tags for all Auto Scaling groups like this:

asg_tags_paginator = autoscale_client.get_paginator('describe_tags')
for page in asg_tags_paginator.paginate():
    for tag in page['Tags']:
        ...

and it seems to fail to get all tags from all Auto Scaling groups.

Here's a better example, where 'NextToken' does not really contain a token...

Python 2.7.5 (default, Mar 9 2014, 22:15:05)
[GCC 4.2.1 Compatible Apple LLVM 5.0 (clang-500.0.68)] on darwin
Type "help", "copyright", "credits" or "license" for more information.

>>> import boto3
>>> autoscale_client = boto3.client('autoscaling', region_name='eu-west-1')
>>> response = autoscale_client.describe_tags()
>>> print response['NextToken']
{"resourceId":"","tag":""}

Hope it's not just my machine :-)

Thanks!

Repeated messages when receiving with SQS

I have a queue with 3 messages for testing: test1, test2 and test3. I created a method print_messages() with contents

for message in queue.receive_messages(MaxNumberOfMessages=10):
    print("Messages: " + message.message_id + ": " + message.body)

Running print_messages() then yields output like:

>>> print_messages()
Message 0554838d-9fe9-467a-8b05-bc6f6d3d54cd: test1

Message 0554838d-9fe9-467a-8b05-bc6f6d3d54cd: test1

Message 0554838d-9fe9-467a-8b05-bc6f6d3d54cd: test1

>>> print_messages()
Message 2e1f4bd0-1638-4a9a-b20d-02788ee04723: test3

Message 2e1f4bd0-1638-4a9a-b20d-02788ee04723: test3

Message 2e1f4bd0-1638-4a9a-b20d-02788ee04723: test3

>>> print_messages()
Message 2d4145c8-1317-41fa-bb56-77d1f9ea5ee6: test2

Message 2d4145c8-1317-41fa-bb56-77d1f9ea5ee6: test2

Message 2d4145c8-1317-41fa-bb56-77d1f9ea5ee6: test2

>>> print_messages()
Message 0554838d-9fe9-467a-8b05-bc6f6d3d54cd: test1

Message 0554838d-9fe9-467a-8b05-bc6f6d3d54cd: test1

Message 0554838d-9fe9-467a-8b05-bc6f6d3d54cd: test1

>>> print_messages()
Message 2d4145c8-1317-41fa-bb56-77d1f9ea5ee6: test2

Message 2d4145c8-1317-41fa-bb56-77d1f9ea5ee6: test2

Message 2d4145c8-1317-41fa-bb56-77d1f9ea5ee6: test2

>>> print_messages()
Message 2e1f4bd0-1638-4a9a-b20d-02788ee04723: test3

Message 2e1f4bd0-1638-4a9a-b20d-02788ee04723: test3

Message 2e1f4bd0-1638-4a9a-b20d-02788ee04723: test3

>>> print_messages()
Message 2e1f4bd0-1638-4a9a-b20d-02788ee04723: test3

Message 2e1f4bd0-1638-4a9a-b20d-02788ee04723: test3

Message 2e1f4bd0-1638-4a9a-b20d-02788ee04723: test3
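This is standard SQS behavior rather than a boto3 bug: a received message is hidden only for the queue's visibility timeout, then becomes receivable again unless it is deleted. A toy in-memory model of that lifecycle, with the real delete call sketched at the end:

```python
# Toy model of SQS redelivery: messages that are received but never
# deleted reappear once the visibility timeout expires.
def receive(q):
    for body, state in q.items():
        if state == 'visible':
            q[body] = 'in-flight'   # hidden for the visibility timeout
            return body
    return None

def visibility_timeout_expires(q):
    for body, state in q.items():
        if state == 'in-flight':
            q[body] = 'visible'     # undeleted messages become receivable

q = {'test1': 'visible'}
assert receive(q) == 'test1'        # first receive hides the message
assert receive(q) is None           # nothing visible while in flight
visibility_timeout_expires(q)
assert receive(q) == 'test1'        # it comes back: it was never deleted

# With the real API, delete each message after processing it:
# for message in queue.receive_messages(MaxNumberOfMessages=10):
#     print("Messages: " + message.message_id + ": " + message.body)
#     message.delete()
```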

Strange behavior when trying to create an S3 bucket in us-east-1

Version info:
boto3 = 0.0.19 (from pip)
botocore = 1.0.0b1 (from pip)
Python = 2.7.9 (from Fedora 22)

I have no problem creating S3 buckets in us-west-1 or us-west-2, but specifying us-east-1 gives InvalidLocationConstraint:

>>> conn = boto3.client("s3")
>>> conn.create_bucket(
    Bucket='testing123-blah-blah-blalalala', 
    CreateBucketConfiguration={'LocationConstraint': "us-east-1"})
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/botocore/client.py", line 200, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/lib/python2.7/site-packages/botocore/client.py", line 255, in _make_api_call
    raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InvalidLocationConstraint) when calling the CreateBucket operation: The specified location-constraint is not valid

Also trying with a s3 client connected directly to us-east-1:

>>> conn = boto3.client("s3", region_name="us-east-1")
>>> conn.create_bucket(Bucket='testing123-blah-blah-blalalala', CreateBucketConfiguration={'LocationConstraint': "us-east-1"})
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/botocore/client.py", line 200, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/lib/python2.7/site-packages/botocore/client.py", line 255, in _make_api_call
    raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InvalidLocationConstraint) when calling the CreateBucket operation: The specified location-constraint is not valid

When I do not specify a region, the bucket is created in us-east-1 (verified in the web console):

>>> conn.create_bucket(Bucket='testing123-blah-blah-blalalala')
{u'Location': '/testing123-blah-blah-blalalala', 'ResponseMetadata': {'HTTPStatusCode': 200, 'HostId': 'Qq2CqKPm4PhADUJ8X+ngxxEE3yRrsT3DOS4TefgzUpYBKzQO/62cQy20yPa1zs7l', 'RequestId': '06B36B1D8B1213C8'}}

...but the bucket returns None for LocationConstraint:

>>> conn.get_bucket_location(Bucket='testing123-blah-blah-blalalala')
{'LocationConstraint': None, 'ResponseMetadata': {'HTTPStatusCode': 200, 'HostId': 'nBGHNu30A/m/RymzuoHLiE2uWuzCsz3v1mcov324r2sMYX7ANq1jOIR0XphWiUIAxDwmxTOW8eA=', 'RequestId': '53A539CC4BCA08C4'}}

us-east-1 is listed as a valid region when I enumerate the regions:

>>> conn = boto3.client("ec2", region_name="us-east-1")
>>> [x["RegionName"] for x in conn.describe_regions()["Regions"]]
['eu-central-1', 'sa-east-1', 'ap-northeast-1', 'eu-west-1', 'us-east-1', 'us-west-1', 'us-west-2', 'ap-southeast-2', 'ap-southeast-1']
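The behavior above is consistent with us-east-1 being S3's default region: it is not a legal LocationConstraint value, so the CreateBucketConfiguration argument must be omitted for it (which also matches get_bucket_location reporting None for such buckets). A small helper capturing that rule:

```python
def create_bucket_params(bucket, region):
    """Build create_bucket kwargs; us-east-1 must omit the constraint."""
    params = {'Bucket': bucket}
    if region != 'us-east-1':
        params['CreateBucketConfiguration'] = {'LocationConstraint': region}
    return params

print(create_bucket_params('testing123-blah-blah-blalalala', 'us-east-1'))
# prints: {'Bucket': 'testing123-blah-blah-blalalala'}

# conn.create_bucket(**create_bucket_params('my-bucket', 'us-west-2'))
```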

Trying to access an SQS message body throws an exception in boto3 versions 0.0.2 and 0.0.3

import boto3

queue_url = ""

sqs = boto3.resource('sqs')
queue = sqs.Queue(queue_url)
messages = queue.receive_messages()
msg = messages[0]
print(msg.body)

It gives:

 File "****", line 31, in perform_action
    print(msg.body)
  File "****lib/python2.7/site-packages/boto3/resources/factory.py", line 262, in property_loader
    '{0} has no load method'.format(self.__class__.__name__))
ResourceLoadException: sqs.Message has no load method

It worked in 0.0.1

listing the top level contents of a s3 bucket with Prefix and Delimiter

Apologies for what sounds like a very basic question. In this example from the S3 docs, is there a way to list the continents? I was hoping this might work, but it doesn't seem to:

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('edsu-test-bucket')

for o in bucket.objects.filter(Delimiter='/'):
    print(o.key)

However, the equivalent code using boto2 does seem to work the way I expect:

import boto

s3 = boto.connect_s3()
bucket = s3.get_bucket('edsu-test-bucket')

for o in bucket.list(delimiter='/'):
    print(o.name)
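With the boto3 client API, the delimiter grouping is exposed directly: the 'continents' come back under CommonPrefixes rather than Contents. A sketch against an illustrative response (the real one comes from list_objects with Delimiter='/'):

```python
# Illustrative response for
#   s3_client.list_objects(Bucket='edsu-test-bucket', Delimiter='/');
# keys grouped by the delimiter land in CommonPrefixes, not Contents.
response = {
    'Contents': [{'Key': 'README.txt'}],
    'CommonPrefixes': [{'Prefix': 'africa/'}, {'Prefix': 'europe/'}],
}

top_level = [p['Prefix'] for p in response.get('CommonPrefixes', [])]
print(top_level)   # prints: ['africa/', 'europe/']
```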

Equivalent of set_contents_from_filename

Sorry if this isn't the right place to ask this question. I can't find a boto3 mailing list.

How do I stream content into S3 from a file using boto3 without reading the whole file's contents into memory? I'm looking for the equivalent of set_contents_from_filename in boto.

Thanks,

Russ
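Later boto3 releases grew a managed transfer method on the S3 client, upload_file, which streams the file from disk in chunks (and switches to multipart for large files) instead of reading it into memory. A sketch with illustrative names:

```python
# upload_file streams from disk rather than buffering the whole file;
# the bucket and key names here are illustrative. boto3 is imported
# inside the helper so the sketch stays self-contained.
def upload(filename, bucket, key):
    import boto3
    s3 = boto3.client('s3')
    s3.upload_file(filename, bucket, key)

# upload('/tmp/archive.tar', 'my-bucket', 'archive.tar')
```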

NotAuthorizedException trying to call cognito.set_identity_pool_roles()

Not sure this is the right place to ask, but I'm also not sure WHERE to ask this. I'm now getting:

botocore.exceptions.ClientError: An error occurred (NotAuthorizedException) when calling the SetIdentityPoolRoles operation: Access to Role 'SomeRoleName' is forbidden.

I created the role just a few lines previous in the python script using

 iam = session.client('iam');
 iam.create_role(..)

and now trying to call

cognito = session.client('cognito-identity', region_name='us-east-1')
cognito.set_identity_pool_roles(..)

Am I doing something wrong? Seems the same credentials that created the role should be able to reference it.

Any ideas?

Futures dependency broken on 2.7 in some cases.

Currently, the wheel package that is installed for boto3 version 1.9 contains this line in its METADATA:

Requires-Dist: futures (==2.2.0); extra == ':python_version=="2.6" or python_version=="2.7"'

The latest version of pip (7.0.3) on Windows ignores this line and the futures dependency is not installed (see longaccess/bigstash-python#23).

When I add the same extras_require line as boto3 to my own package's setup.py and run bdist_wheel, I get this slightly different line (with wheel==0.24.0):

Requires-Dist: futures (==2.0.0); python_version=="2.6" or python_version=="2.7"

In this case, installing the wheel properly installs futures.

Git tags aren't aligned with pypi packages

I was having a bug with SQS between versions 2.30.0 and 2.31.0, with messages getting corrupted. I decided to check out the diff, so I cloned the repo and did git diff 2.30.0 2.31.0. The only changes between the tags are related to CloudWatch, which was very confusing. After rechecking my work a few times, I downloaded the two packages via pip install --download-cache=".", unzipped them and diffed them. To my surprise there were many more changes between the releases, with 96cd2800 being what was causing my issues. I just thought I'd drop you a line and let you know that something in the deployment process is screwy.

Futures not listed as dependency for python 2.7

When running on Python 2.7, boto3.s3.transfer.S3Transfer has a dependency on the "futures" package (a backport of Python 3.x's concurrent.futures) which is not installed by default with a "pip install boto3".

Wrong response for spot price history for school sub-account

I asked you about the spot price history correctness yesterday. I actually found that the response with the key ID and secret key of a sub-account of my school account was different from the response I got with the key ID and secret key of my own account: the former returned the wrong results and the latter the right ones. As a reminder, here is the issue I reported yesterday: #122. I think the responses should be the same; maybe you need to check on it. I have tried boto, boto3, and the AWS CLI, and all returned the same results.

s3 list_objects - no ContentType?!?

Would it be at all possible to make the s3 client's list_objects method return the ContentType for each object? The get_object and head_object methods both do this.

I suspect wanting to know an object's type would be a fairly common requirement when performing a list operation.

Is there even a way to get the content type, in bulk, with the current boto3 S3 API? Obviously doing a head request on each object individually is not practical.

Understanding read the docs

Where do I find the additional information not covered in the boto3 Read the Docs documentation?

I'm using EMR and need to use add_job_flow_steps, but I don't know what the structure of the steps list needs to be. I know how to do it using the AWS CLI.

Is there an additional resource I should be looking at that has this information about parameter structure requirements when using boto3?
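For what it's worth, Steps for add_job_flow_steps is a list of dicts mirroring the CLI's step definitions; the step name, JAR location, and arguments below are all illustrative:

```python
# Illustrative step list for:
#   emr.add_job_flow_steps(JobFlowId='j-XXXXXXXX', Steps=steps)
# Each step names a Hadoop JAR plus its arguments.
steps = [
    {
        'Name': 'my-step',                       # illustrative step name
        'ActionOnFailure': 'CONTINUE',
        'HadoopJarStep': {
            'Jar': 's3://my-bucket/my-job.jar',  # illustrative JAR path
            'Args': ['arg1', 'arg2'],
        },
    },
]
```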

s3.Bucket has no attribute load

Attempting to load the bucket's creation_date results in a ResourceLoadException.

>>> import sys
>>> sys.version
'2.7.6 (default, Sep  9 2014, 15:04:36) \n[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.39)]'
>>> import boto3
>>> s3 = boto3.resource('s3')
>>> b = s3.Bucket('mybucket')
>>> b.name
'mybucket'
>>> b.creation_date
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/site-packages/boto3/resources/factory.py", line 202, in property_loader
    '{0} has no load method'.format(self.__class__.__name__))
boto3.exceptions.ResourceLoadException: s3.Bucket has no load method
$ pip show boto3

---
Name: boto3
Version: 0.0.15

Route53Domains list_domains method fails

boto3 0.0.7

>>> import boto3
>>> r53 = boto3.client('route53domains')
>>> r53.list_domains()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/aws/lib/lib/python2.7/site-packages/botocore/client.py", line 238, in _api_call
    operation_model, request_dict)
TypeError: 'NoneType' object is not iterable

s3 resource has no attribute 'Object' in boto3 0.0.7

I just upgraded to boto3 0.0.7, and now I'm getting an error when trying to call s3.Object().

s3 = boto3.resource('s3')
obj = s3.Object('my-bucket', 'my-key')

That throws an AttributeError: "AttributeError: 's3' object has no attribute 'Object'"

If I go back to 0.0.6 or 0.0.5, I have no problems accessing s3.Object.

Was there a breaking change here? I didn't see anything in the release notes or changes in the docs.

Unauthenticated cognito-identity get_id access?

I can't seem to figure out how to make a cognito-identity.get_id(...) call. This call does not require an access-key or secret. Is there a way to do this? Or is this out of scope for boto3?

Question about introspection

I can easily find the available service resources given a session:

>>> import boto3.session
>>> session = boto3.session.Session(profile_name='prod')
>>> session.available_resources
['cloudformation',
 'dynamodb',
 'ec2',
 'glacier',
 'iam',
 'opsworks',
 's3',
 'sns',
 'sqs']

But is there a way to find the available resources for a given service resource? I.e., is there some way to get the equivalent of:

>>> ec2 = session.resource('ec2')
>>> ec2.available_resources

I've been going through the code for a while and can't really figure out an easy way to do this.
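One crude trick, since sub-resource factories are exposed as capitalized attributes on the resource object, is to filter dir() for uppercase-initial names. Shown here against a stand-in object rather than a real ec2 resource:

```python
def subresource_names(resource):
    """Capitalized attributes are how boto3 exposes sub-resource factories."""
    return sorted(n for n in dir(resource) if n[:1].isupper())

# Stand-in for an ec2 service resource, which mixes factories such as
# Instance and Vpc with lowercase actions and collections.
class FakeEC2(object):
    def Instance(self, id): pass
    def Vpc(self, id): pass
    def instances(self): pass

print(subresource_names(FakeEC2()))   # prints: ['Instance', 'Vpc']
```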

S3 object property "size" not available

I'm using boto3-0.0.7. I'm expecting this code to output the sizes of the keys in the bucket:

s3 = boto3.resource('s3')
bucket = s3.Bucket(BUCKET)
for o in bucket.objects.all():
  print o.size

instead I get this error:

Traceback (most recent call last):
  File "./bug_report", line 10, in <module>
    print o.size
  File "/home/glacier/lib/lib/python2.7/site-packages/boto3/resources/factory.py", line 257, in property_loader
    '{0} has no load method'.format(self.__class__.__name__))
boto3.exceptions.ResourceLoadException: s3.ObjectSummary has no load method

I'm using a bucket in the eu-central-1 region.

S3 object.put() throws NoneType exception with no network connection

I was attempting to test some failure logic by calling an S3 Object's put method with no network connection. Since the response isn't really documented, I was expecting an empty response; instead I received this stack trace:

  File "Documents/development/amazon-aws/aws/uploader.py", line 63, in upload_file
    response = upload_file.put(Body=file_handle)
  File "Documents/development/virtualenvs/potree-converter3/lib/python3.4/site-packages/boto3/resources/factory.py", line 379, in do_action
    response = action(self, *args, **kwargs)
  File "Documents/development/virtualenvs/potree-converter3/lib/python3.4/site-packages/boto3/resources/action.py", line 77, in __call__
    response = getattr(parent.meta.client, operation_name)(**params)
  File "Documents/development/virtualenvs/potree-converter3/lib/python3.4/site-packages/botocore/client.py", line 292, in _api_call
    operation_model, request_dict)
TypeError: 'NoneType' object is not iterable

Here's the code that caused it (again, with no network connection):

#create the object to put
upload_file = self.bucket.Object(key)

#upload the file object
response = upload_file.put(Body=file_handle)

method generate_url (get temporary url)

With the boto library (not boto3), there is a method generate_url which generates a temporary URL for a given object.
This method is available on connection, bucket, and key objects.

import boto
con = boto.connect_s3()
tmpurl_c = con.generate_url(...)
bucket = con.get_bucket("mybucket")
tmpurl_b = bucket.generate_url(...)
key = bucket.get_object("key/name.txt")
tmpurl_k = key.generate_url(...)

I was unable to find corresponding method in boto3. Is there such a method? Or is it planned?

I am aware that actually creating the temporary URL is not a call to an AWS web service (so I do not expect to find it in the AWS service models), but for practical use it is very important.

Fortunately, today we can use boto to do this work, but if boto3 is supposed to be a complete replacement for boto, the ability to generate temporary URLs seems important.
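For context, boto's generate_url never calls AWS; it computes a query-string signature locally from the secret key. A stdlib-only sketch of the legacy (SigV2-style) scheme it used — the bucket, key, and credentials below are made up, and real code should let the SDK do the signing:

```python
import base64
import hmac
import time
from hashlib import sha1
from urllib.parse import quote

def presign_get(bucket, key, secret_key, access_key, expires_in=3600):
    # Legacy S3 query-string auth signs "GET\n\n\n{expires}\n/{bucket}/{key}"
    # with HMAC-SHA1 over the secret key; no network call is involved.
    expires = int(time.time()) + expires_in
    string_to_sign = "GET\n\n\n%d\n/%s/%s" % (expires, bucket, key)
    sig = base64.b64encode(
        hmac.new(secret_key.encode(), string_to_sign.encode(), sha1).digest()
    ).decode()
    return ("https://%s.s3.amazonaws.com/%s"
            "?AWSAccessKeyId=%s&Expires=%d&Signature=%s"
            % (bucket, key, access_key, expires, quote(sig, safe='')))

url = presign_get('mybucket', 'key/name.txt', 'FAKE_SECRET', 'AKIDEXAMPLE')
```

Later versions of boto3 did add an equivalent on the low-level client, generate_presigned_url, which covers this use case without hand-rolling the signature.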

Just return part of requested price history

When I use describe_spot_price_history() to get the full history, I only get part of it. It seems the returned parameter "NextToken" is sometimes an empty string where it should be a non-empty token for fetching the remaining prices. And sometimes, when there were more than 1000 entries of history, fewer than 1000 entries were returned for a given token.
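Setting the truncation bug aside, the intended client-side pattern is to keep calling with the returned token until NextToken comes back empty. A stdlib sketch with a stub standing in for the real describe_spot_price_history (the page contents are invented):

```python
# Fake paged API: each token maps to (results, next_token), with an empty
# next_token signalling the final page.
PAGES = {
    '': (['p1', 'p2'], 'tok1'),
    'tok1': (['p3'], ''),
}

def describe_spot_price_history(NextToken=''):
    prices, next_token = PAGES[NextToken]
    return {'SpotPriceHistory': prices, 'NextToken': next_token}

def all_prices():
    history, token = [], ''
    while True:
        page = describe_spot_price_history(NextToken=token)
        history.extend(page['SpotPriceHistory'])
        token = page['NextToken']
        if not token:          # empty string means no more pages
            return history

print(all_prices())  # → ['p1', 'p2', 'p3']
```

Newer boto3 releases also ship paginators (client.get_paginator('describe_spot_price_history')) that encapsulate this loop, though they cannot help if the service itself returns an empty token prematurely.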

S3 multipart upload: how to add parts?

Hi! I'm not sure how to do this, could you clarify? (boto3 0.0.7)

>>> import boto3
>>> s3 = boto3.resource('s3')
>>> bucket = s3.Bucket('ubitricity-backup')
>>> o = bucket.Object('test')
>>> mpu = o.initiate_multipart_upload()
>>> part = mpu.MultipartUploadPart(1)
>>> part.upload('datadatadata')
{u'ETag': '"d41d8cd98f00b204e9800998ecf8427e"', 'ResponseMetadata': {'HTTPStatusCode': 200, 'HostId': 'leCanviHIAh0lS9UursVkiMiyzksNVMAn+jYgyClzp5us8mpCjwB06WEeifjy7LX', 'RequestId': '6C5B491C7D0E7060'}}
>>> mpu.complete()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 379, in do_action
    response = action(self, *args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 77, in __call__
    response = getattr(parent.meta.client, operation_name)(**params)
  File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 299, in _api_call
    raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InvalidRequest) when calling the CompleteMultipartUpload operation: You must specify at least one part

I understand the MultipartUploadPart is not being added to the MultipartUpload object. But how is it done? I was looking for a method like mpu.add_part() but there is none.

Thank you!
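For reference, at the API level CompleteMultipartUpload must be given an explicit list of {PartNumber, ETag} pairs for every uploaded part; the resource layer in 0.0.7 apparently never assembles that list. A stdlib sketch of building the completion payload yourself — upload_part here is a stub returning a fake ETag, not the real network call:

```python
import hashlib

def upload_part(part_number, data):
    # Stub for MultipartUploadPart.upload(); the real call returns the
    # part's ETag in its response.
    return {'ETag': '"%s"' % hashlib.md5(data).hexdigest()}

def build_completion_payload(parts):
    # CompleteMultipartUpload expects MultipartUpload={'Parts': [...]},
    # each entry pairing a part number with the ETag its upload returned.
    return {'Parts': [{'PartNumber': n, 'ETag': r['ETag']}
                      for n, r in sorted(parts.items())]}

parts = {}
for n, chunk in enumerate([b'datadatadata'], start=1):
    parts[n] = upload_part(n, chunk)

payload = build_completion_payload(parts)
```

With the low-level client, this payload is what complete_multipart_upload(Bucket=..., Key=..., UploadId=..., MultipartUpload=payload) expects.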

S3Transfer.upload_file calls callback twice

As per the examples in the docs, passing a callback to S3Transfer.upload_file displays the upload progress. However, because botocore.auth.SigV4Auth.payload consumes the request body, the callback actually fires over the full body twice: once while calculating the SHA256 of the body and once more while sending it. This is somewhat cumbersome to work around; I did it by adding these lines to the Progress.__call__ example:

            if bytes_amount > 0 and self._seen_so_far == self._size:
                # assume we are now only starting the actual transfer
                self._seen_so_far = 0

It would be nice if somehow this could be done a bit better, seems however that botocore has no support for progress reporting and I don't see an obvious way to do it correctly (apart from the present solution).
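The workaround above can be made self-contained. In this stdlib-only sketch the transfer is simulated with plain calls: a body of size 10 is consumed twice, and resetting the counter at the end of the first (signing) pass makes the second pass report 4, 8, 10 again instead of climbing to 20:

```python
class Progress:
    """Progress callback that tolerates SigV4 reading the body twice:
    once to hash the payload, once to actually send it."""
    def __init__(self, size):
        self._size = size
        self._seen_so_far = 0
        self.reports = []           # what a progress bar would display

    def __call__(self, bytes_amount):
        self._seen_so_far += bytes_amount
        self.reports.append(self._seen_so_far)
        if bytes_amount > 0 and self._seen_so_far == self._size:
            # Assume a full pass just finished (the signing pass comes
            # first); restart the counter so the next pass begins at zero.
            self._seen_so_far = 0

p = Progress(10)
for _ in range(2):                  # signing pass, then the real upload
    for chunk in (4, 4, 2):
        p(chunk)

print(p.reports)  # → [4, 8, 10, 4, 8, 10]
```

Without the reset, the second pass would report 14, 18, 20 against a total of 10.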

Trouble with server side encryption

I'm using the code (see below) to upload and encrypt a file in s3 using a valid keyAlias (based on the new KMS features)

It fails with the error: 'Requests specifying Server Side Encryption with AWS KMS managed keys require AWS Signature Version 4.'

I have specified '[s3] use-sigv4 = True' in my ~/.aws/config file.

I can get this functionality to work via the Java SDK.

s3 = boto3.resource('s3')
data = open(fileName, 'rb')
s3.Bucket(bucketName).put_object(Key=keyName, Body=data, ServerSideEncryption='aws:kms', SSEKMSKeyId='keyAlias')

S3.Object requires ContentType during put(), but docs say optional

I get a SignatureDoesNotMatch error when trying to put an object without specifying the content type. After turning on logging and looking at the raw Message prop in the response, I saw that S3 was expecting the client to sign "PUT\napplication/x-www-form-urlencoded\n...", but botocore was signing "PUT\n\n...". Once I specified any content type, the client was able to sign the string correctly and put the object.

Using boto3 0.0.5 w/ botocore 0.78.0.

SignatureDoesNotMatch with certain uploaded S3 content

I'm just getting going on boto3, but stumbling on an issue. The following comes back with a SignatureDoesNotMatch exception:

import boto3
aws_session = boto3.Session(aws_access_key_id=AWS_ACCESS_KEY,
                            aws_secret_access_key=AWS_SECRET_KEY,
                            region_name='us-west-1')
s3 = aws_session.resource('s3')
b = s3.Bucket(BUCKET)

bad = b'\xe2'
good = "this works"
b.put_object(Key='test', Body=bad)

botocore.exceptions.ClientError: An error occurred (SignatureDoesNotMatch) when calling the PutObject operation

Am I missing something obvious?

No module named s3.inject

Using the latest boto3 (0.0.14), I'm getting an ImportError for most API calls related to s3. An example: this line

obj = boto3.resource('s3').Object(bucket, key)

produces

ImportError: No module named s3.inject

Going back to boto3 0.0.13 fixes the problem.

String manipulation required for Route 53

In order to contribute to the project and get a better understanding of how Boto3 and Botocore are interconnected and how this code generation works, I have attempted to add a resource definition for Route 53.

I do however hit a problem when I list the hosted zones and then try to do something with them. My current JSON file for Route 53 can be found here, and an example of the problem can be seen here:

>>> import boto3
>>> route53 = boto3.resource('route53')
>>> zones = list(route53.hosted_zones.all())
>>> zones
[route53.HostedZone(id='/hostedzone/Z3HVO9WA3FVVP1')]
>>> zones[0].id
'/hostedzone/Z3HVO9WA3FVVP1'
>>> zones[0].delete()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/jeppe/.pyenv/versions/boto/lib/python3.4/site-packages/boto3/resources/factory.py", line 377, in do_action
    response = action(self, *args, **kwargs)
  File "/Users/jeppe/.pyenv/versions/boto/lib/python3.4/site-packages/boto3/resources/action.py", line 78, in __call__
    response = getattr(parent.meta['client'], operation_name)(**params)
  File "/Users/jeppe/.pyenv/versions/boto/lib/python3.4/site-packages/botocore/client.py", line 246, in _api_call
    raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (400) when calling the DeleteHostedZone operation: Bad Request

As the output shows, the ID I get back for the zone in the listing is /hostedzone/Z3HVO9WA3FVVP1, but the API for deleting a hosted zone expects to be called with a bare ID such as Z3HVO9WA3FVVP1. I don't see how this kind of string manipulation could be done in the resource files for the other services, so I wonder if this is a topic that simply hasn't been addressed yet?

It would of course be easier if the Route 53 API returned the ID in the same format it expects to receive it, but it may not be easy to get Amazon to change that or to add the bare ID as a new field in the output.
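Pending a fix in the resource definition itself, the prefix can be stripped in user code; the bare ID is just the last path segment (example ID taken from the listing above):

```python
def bare_zone_id(hosted_zone_id):
    # '/hostedzone/Z3HVO9WA3FVVP1' -> 'Z3HVO9WA3FVVP1';
    # IDs that are already bare pass through unchanged.
    return hosted_zone_id.rsplit('/', 1)[-1]

print(bare_zone_id('/hostedzone/Z3HVO9WA3FVVP1'))  # → Z3HVO9WA3FVVP1
```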

kms.decrypt CiphertextBlob results in InvalidCiphertextException

I am using the following code to get a data key and as I understand the documentation this gives me the key in plaintext and an encrypted blob to be stored with the encrypted data for future decryption.

data_key = kms.generate_data_key(KeyId=keyId, KeySpec='AES_256')
data_key_plain = data_key['Plaintext']
cipher = data_key['CiphertextBlob']

Based on my understanding I should be able to use the following call to get the plainText key from the ciphertextblob

decrypted_ciphertext = kms.decrypt(CiphertextBlob=cipher)

but I get: ClientError: An error occurred (InvalidCiphertextException) when calling the Decrypt operation: None

Am I making any obvious errors in my assumptions?
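One common cause of InvalidCiphertextException (not confirmed from this report) is the CiphertextBlob getting mangled between generate_data_key and decrypt: decrypt must receive the exact bytes that were returned. If the blob is persisted through a text medium (JSON, a database text column, an environment variable), base64-encode it on the way in and decode it on the way out. A stdlib sketch of the round trip, with a made-up blob:

```python
import base64

def blob_to_text(ciphertext_blob):
    # Encode the binary blob for safe storage in a text field.
    return base64.b64encode(ciphertext_blob).decode('ascii')

def text_to_blob(stored):
    # Recover the exact original bytes to pass back as CiphertextBlob=...
    return base64.b64decode(stored)

blob = b'\x01\x02\xff\x00fake-ciphertext-blob'
stored = blob_to_text(blob)
assert text_to_blob(stored) == blob   # byte-for-byte identical
```

If the blob survives storage byte-for-byte and decrypt still fails, the usual remaining suspects are decrypting in a different region or account than the key lives in.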
