
botocore's Introduction

#### Deprecation notice ####

This package is no longer maintained and has been replaced by Boto3. Issues and pull requests are not reviewed. If you are having an issue with the Boto3 package or the AWS CLI, please open an issue on their respective repositories.

boto

boto 2.49.0

Released: 11-July-2018


Introduction

Boto is a Python package that provides interfaces to Amazon Web Services. Currently, all features work with Python 2.6 and 2.7. Work is under way to support Python 3.3+ in the same codebase. Modules are being ported one at a time with the help of the open source community, so please check below for compatibility with Python 3.3+.

To port a module to Python 3.3+, please view our Contributing Guidelines and the Porting Guide. If you would like, you can open an issue to let others know about your work in progress. Tests must pass on Python 2.6, 2.7, 3.3, and 3.4 for pull requests to be accepted.

Services

At the moment, boto supports:

  • Compute
    • Amazon Elastic Compute Cloud (EC2) (Python 3)
    • Amazon Elastic MapReduce (EMR) (Python 3)
    • AutoScaling (Python 3)
    • Amazon Kinesis (Python 3)
    • AWS Lambda (Python 3)
    • Amazon EC2 Container Service (Python 3)
  • Content Delivery
    • Amazon CloudFront (Python 3)
  • Database
    • Amazon Relational Database Service (RDS)
    • Amazon DynamoDB (Python 3)
    • Amazon SimpleDB (Python 3)
    • Amazon ElastiCache (Python 3)
    • Amazon Redshift (Python 3)
  • Deployment and Management
    • AWS Elastic Beanstalk (Python 3)
    • AWS CloudFormation (Python 3)
    • AWS Data Pipeline (Python 3)
    • AWS OpsWorks (Python 3)
    • AWS CloudTrail (Python 3)
    • AWS CodeDeploy (Python 3)
  • Administration & Security
    • AWS Identity and Access Management (IAM) (Python 3)
    • AWS Key Management Service (KMS) (Python 3)
    • AWS Config (Python 3)
    • AWS CloudHSM (Python 3)
  • Application Services
    • Amazon CloudSearch (Python 3)
    • Amazon CloudSearch Domain (Python 3)
    • Amazon Elastic Transcoder (Python 3)
    • Amazon Simple Workflow Service (SWF) (Python 3)
    • Amazon Simple Queue Service (SQS) (Python 3)
    • Amazon Simple Notification Service (SNS) (Python 3)
    • Amazon Simple Email Service (SES) (Python 3)
    • Amazon Cognito Identity (Python 3)
    • Amazon Cognito Sync (Python 3)
    • Amazon Machine Learning (Python 3)
  • Monitoring
    • Amazon CloudWatch (EC2 Only) (Python 3)
    • Amazon CloudWatch Logs (Python 3)
  • Networking
    • Amazon Route53 (Python 3)
    • Amazon Route 53 Domains (Python 3)
    • Amazon Virtual Private Cloud (VPC) (Python 3)
    • Elastic Load Balancing (ELB) (Python 3)
    • AWS Direct Connect (Python 3)
  • Payments and Billing
    • Amazon Flexible Payment Service (FPS)
  • Storage
    • Amazon Simple Storage Service (S3) (Python 3)
    • Amazon Glacier (Python 3)
    • Amazon Elastic Block Store (EBS)
    • Google Cloud Storage
  • Workforce
    • Amazon Mechanical Turk
  • Other
    • Marketplace Web Services (Python 3)
    • AWS Support (Python 3)

The goal of boto is to support the full breadth and depth of Amazon Web Services. In addition, boto provides support for other public services such as Google Storage, as well as private cloud systems like Eucalyptus, OpenStack and Open Nebula.

Boto is developed mainly using Python 2.6.6 and Python 2.7.3 on Mac OS X and Ubuntu Maverick. It is known to work on other Linux distributions and on Windows. Most of boto requires no additional libraries or packages other than those distributed with Python. Efforts are made to keep boto compatible with Python 2.5.x, but no guarantees are made.

Installation

Install via pip:

$ pip install boto

Install from source:

$ git clone git://github.com/boto/boto.git
$ cd boto
$ python setup.py install

ChangeLogs

To see what has changed over time in boto, you can check out the release notes at http://docs.pythonboto.org/en/latest/#release-notes

Finding Out More About Boto

The main source code repository for boto can be found on github.com. The boto project uses the gitflow model for branching.

Online documentation is also available. The online documentation includes full API documentation as well as Getting Started Guides for many of the boto modules.

Boto releases can be found on the Python Cheese Shop (PyPI).

Join our IRC channel #boto on FreeNode. Webchat IRC channel: http://webchat.freenode.net/?channels=boto

Join the boto-users Google Group.

Getting Started with Boto

Your credentials can be passed into the methods that create connections. Alternatively, boto will check for the existence of the following environment variables to ascertain your credentials:

AWS_ACCESS_KEY_ID - Your AWS Access Key ID

AWS_SECRET_ACCESS_KEY - Your AWS Secret Access Key

Credentials and other boto-related settings can also be stored in a boto config file. See this for details.
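For illustration, here is a rough sketch of the environment-variable lookup described above. The helper name is hypothetical; the real resolution logic lives inside boto's connection classes:

```python
import os

def credentials_from_env():
    """Return (access_key, secret_key) from the environment, or None.

    Hypothetical helper illustrating the lookup boto performs when no
    credentials are passed to the connection methods directly.
    """
    access_key = os.environ.get('AWS_ACCESS_KEY_ID')
    secret_key = os.environ.get('AWS_SECRET_ACCESS_KEY')
    if access_key and secret_key:
        return access_key, secret_key
    return None

os.environ['AWS_ACCESS_KEY_ID'] = 'AKIAEXAMPLE'
os.environ['AWS_SECRET_ACCESS_KEY'] = 'secret'
print(credentials_from_env())  # ('AKIAEXAMPLE', 'secret')
```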

botocore's Issues

Exception initializing botocore session without any other config

I'm attempting to use botocore entirely standalone - no reliance on configuration files, environment variables or iam roles. It looks to me like this ought to work.

If I initialize a boto session like so:

session = botocore.session.get_session({
    'access_key': "myawskeyid",
    'secret_key': "myawssecret",
    'region': "ap-southeast-1",
    })

I get an exception when attempting to load credentials elsewhere:

Traceback (most recent call last):
...
  File "/usr/local/lib/python2.7/dist-packages/botocore/service.py", line 90, in get_endpoint
    return get_endpoint(self, region_name, endpoint_url)
  File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 117, in get_endpoint
    return QueryEndpoint(service, region_name, endpoint_url)
  File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 51, in __init__
    credentials=self.session.get_credentials(),
  File "/usr/local/lib/python2.7/dist-packages/botocore/session.py", line 144, in get_credentials
    metadata)
  File "/usr/local/lib/python2.7/dist-packages/botocore/credentials.py", line 189, in get_credentials
    metadata=metadata)
  File "/usr/local/lib/python2.7/dist-packages/botocore/credentials.py", line 82, in search_iam_role
    metadata = metadata['security-credentials']
KeyError: 'security-credentials'

Fix output in multi result pagination (build_full_result)

Because we use izip_longest you can get a response like this:

{"CommonPrefixes": [null, null, null, null],
 "Content": [{...}, {...}, {...}, {...}
}

Really, when the value is null we shouldn't add it to the list. Our response should instead look like:

{"CommonPrefixes": [],
 "Content":  [{...}, {...}, {...}, {...}
}
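A minimal sketch of the proposed fix, with hypothetical names (the real logic lives in build_full_result): drop zip_longest's None padding instead of appending it.

```python
from itertools import zip_longest  # izip_longest on Python 2

def merge_pages(page):
    """Merge paginated result lists, skipping zip_longest's None
    padding instead of appending it (sketch of the proposed fix)."""
    merged = {"CommonPrefixes": [], "Content": []}
    for prefix, content in zip_longest(page["CommonPrefixes"], page["Content"]):
        if prefix is not None:
            merged["CommonPrefixes"].append(prefix)
        if content is not None:
            merged["Content"].append(content)
    return merged

page = {"CommonPrefixes": [], "Content": [{"Key": "a"}, {"Key": "b"}]}
print(merge_pages(page))  # {'CommonPrefixes': [], 'Content': [{'Key': 'a'}, {'Key': 'b'}]}
```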

Can't get IAM role credentials on python3

Trying to access the metadata service gives a traceback of:

Traceback (most recent call last):
  File "c:\temp\aws-cli\awscli\clidriver.py", line 169, in _call
    endpoint_url=self.main_parser.args.endpoint_url)
  File "C:\Temp\venv33\lib\site-packages\botocore-0.10.0-py3.3.egg\botocore\service.py", line 113, in get_endpoint
    return get_endpoint(self, region_name, endpoint_url)
  File "C:\Temp\venv33\lib\site-packages\botocore-0.10.0-py3.3.egg\botocore\endpoint.py", line 223, in get_endpoint
    credentials=service.session.get_credentials(),
  File "C:\Temp\venv33\lib\site-packages\botocore-0.10.0-py3.3.egg\botocore\session.py", line 302, in get_credentials
    metadata)
  File "C:\Temp\venv33\lib\site-packages\botocore-0.10.0-py3.3.egg\botocore\credentials.py", line 188, in get_credentials
    metadata=metadata)
  File "C:\Temp\venv33\lib\site-packages\botocore-0.10.0-py3.3.egg\botocore\credentials.py", line 80, in search_iam_role
    metadata = _search_md()
  File "C:\Temp\venv33\lib\site-packages\botocore-0.10.0-py3.3.egg\botocore\credentials.py", line 58, in _search_md
    fields = r.content.split('\n')
TypeError: Type str doesn't support the buffer API

I happen to be on a windows machine, but I think this is just a python3 str vs. bytes issue.
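A sketch of one way to fix the str vs. bytes issue: decode the response content before splitting, so the same code runs on Python 2 and 3. The helper name is hypothetical; the real code is in botocore/credentials.py.

```python
def split_metadata_fields(content):
    """Split metadata-service response content into lines, decoding
    bytes first so the same code works on Python 2 and 3 (sketch of
    a fix for the str vs. bytes issue above)."""
    if isinstance(content, bytes):
        content = content.decode('utf-8')
    return content.split('\n')

print(split_metadata_fields(b'role-a\nrole-b'))  # ['role-a', 'role-b']
```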

Incorrect unicode handling in py2.7

Here's a traceback on DescribeSnapshots. The issue seems to be that a unicode object is passed through when the code thinks it has an encoded string, around line 70 of response.py, and it breaks when the content is not an ASCII string. The trivial fix is to detect the object type and encode(encoding) before passing it through to the XML parser.

File "/home/ubuntu/zephyr/local/lib/python2.7/site-packages/botocore/operation.py", line 62, in call
response = endpoint.make_request(self, params)
File "/home/ubuntu/zephyr/local/lib/python2.7/site-packages/botocore/endpoint.py", line 105, in make_request
http_response)
File "/home/ubuntu/zephyr/local/lib/python2.7/site-packages/botocore/response.py", line 380, in get_response
xml_response.parse(body, encoding)
File "/home/ubuntu/zephyr/local/lib/python2.7/site-packages/botocore/response.py", line 70, in parse
parser.feed(s)
UnicodeEncodeError: 'ascii' codec can't encode character u'\ufeff' in position 17793: ordinal not in range(128)

/home/ubuntu/zephyr/local/lib/python2.7/site-packages/botocore/response.py(70)parse()
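A minimal sketch of the trivial fix described above, using the standard-library parser (the real code path is in botocore/response.py; the helper name is hypothetical):

```python
import xml.etree.ElementTree as ET

def parse_xml(body, encoding='utf-8'):
    """Encode a unicode body to bytes before feeding the parser, so
    non-ASCII content doesn't trip the ascii codec (sketch of the
    trivial fix described in the issue)."""
    if not isinstance(body, bytes):
        body = body.encode(encoding)
    parser = ET.XMLParser()
    parser.feed(body)
    return parser.close()

root = parse_xml(u'<Response><Name>caf\xe9</Name></Response>')
print(root.find('Name').text)  # non-ASCII text parses without error
```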

Model issue with ec2.DescribeNetworkInterfaceAttribute

It appears that the data model used for DescribeNetworkInterfaceAttribute does not line up with the API description: http://docs.aws.amazon.com/AWSEC2/latest/CommandLineReference/ApiReference-cmd-DescribeNetworkInterfaceAttribute.html

Based on how DescribeInstanceAttribute is set up, I am assuming that the model should be an enum that accepts the following attributes:

description
sourceDestCheck
groupSet
attachment

In its current state, Amazon appears to reject any request regardless of the string value you pass in. Please let me know if you need any additional information, and thanks in advance for taking a look :).

s3 signature error

Looks like when I refactored the auth module for the sigv4 test suite, I broke the s3 auth class.

Fixing now.

Using botocore without config files, env vars or IAM roles

It's desirable to be able to use botocore entirely standalone - no reliance on configuration files, environment variables or IAM roles.

Currently, it's necessary to do something hacky like this:

session = botocore.session.get_session()
service = session.get_service('ec2')

# HACK manually set the botocore credentials object
session._credentials = botocore.credentials.Credentials(
    access_key=__opts__['AWS.id'],
    secret_key=__opts__['AWS.key'],
    )

endpoint = service.get_endpoint(region)

IAM role credentials not used with empty ~/.boto config

I had a ~/.boto config file that had a [Credentials] section with commented out keys, and instead of moving on to the next potential provider, I get:

# aws iam list-users
No option 'aws_access_key_id' in section: 'Credentials'

In boto, it moves on to the IAM role. I think it makes sense to update botocore to do the same thing.

Can't put contents to s3 bucket outside of US standard region with file object

When you use a file like object as the body param to s3.PutObject for a bucket outside of US standard, you'll get a 307 followed by a 400 bad request and eventually a socket timeout. To repro:

  1. Create a bucket outside of us standard.
  2. Send a PutObject request with a file like object for the body: op.call(endpoint, body=open('/some/file', 'rb'), ...)

You'll see a 307 then a 400. I think this is because requests is configured to follow redirects, but there's no location header and hence the bad request.

I think the fact that this is a file like object exposes the bug because we send two packets, one for the request+headers then one for the body. When the body is a string (like what we use in the integration tests), then httplib will do a msg += message_body and send it as a single chunk which succeeds.

This also might be related to the virtual host handler we have. I noticed this in the log messages:

botocore.handlers: DEBUG: Checking for DNS compatible bucket for: https://s3-us-west-2.amazonaws.com/botocoretest1374195914-317
botocore.handlers: DEBUG: URI updated to: https://botocoretest1374195914-317.s3.amazonaws.com

Would be good to get an integration test written for this also.

Better error message for services that don't exist

Filing this so I don't forget:

>>> import botocore.session
>>> botocore.session.get_session().get_service('badservice')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "botocore/botocore/session.py", line 333, in get_service
    return botocore.service.get_service(self, service_name, provider_name)
  File "botocore/botocore/service.py", line 141, in get_service
    return Service(session, provider_name, service_name)
  File "botocore/botocore/service.py", line 46, in __init__
    self.__dict__.update(sdata)
ValueError: dictionary update sequence element #0 has length 11; 2 is required

I think the root cause might be related to the fact that get_data('aws/badservice') returns a list of available services whereas get_data('aws/ec2') returns the model:

>>> botocore.session.get_session().get_data('aws/badservice')
['autoscaling', 'cloudformation', ...]
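One way the error message could be improved, sketched with a plain dict standing in for the loaded data (all names here are hypothetical, not botocore's actual API):

```python
class UnknownServiceError(Exception):
    pass

def get_service_data(models, service_name):
    """Hypothetical guard: fail with a clear message for unknown
    services instead of letting Service.__init__ blow up on a list."""
    if service_name not in models:
        raise UnknownServiceError(
            "Unknown service: %r. Valid service names are: %s"
            % (service_name, ', '.join(sorted(models))))
    return models[service_name]

# Toy stand-in for what get_data('aws/<service>') would return:
models = {'ec2': {'api_version': '2013-02-01'}, 'sqs': {'api_version': '2012-11-05'}}
try:
    get_service_data(models, 'badservice')
except UnknownServiceError as e:
    print(e)  # Unknown service: 'badservice'. Valid service names are: ec2, sqs
```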

S3 PostObject operation

It needs to be described in JSON so that requests can be presigned for uploads from a browser (and from clients without PUT support).

PS: It would also be great to have a presigning API method.

Data path not handled correctly on Windows

The data search path is constructed using the __file__ attribute of the base.py module, but it doesn't take the drive letter into account on Windows installations. So, installing on C: and then running from D: will cause an error:

botocore.exceptions.DataNotFoundError: Unable to load data for: cli

Subclassing

It is impossible to customize the class structure without monkey-patching modules. E.g., to replace the Service class with a subclass (MockService), I have to somehow inject a function into the botocore.service module. Hackish, not good.

I would expect factory methods (as the pattern defines) on the Session class, so they could be overridden in a Session subclass. In that case, to use a subclass of Operation I would only need to subclass Session, Service and Endpoint. That seems very straightforward and explicit to me.

Maybe this suggestion will require some redesign. I can make a pull request if you are OK with the idea.

Unquoted profiles are not found

The README for aws-cli says you can have a profile like this:

[profile myprofile]
aws_access_key_id = ...

But if you try to use this profile by specifying --profile myprofile you will get:

NoRegionError: You must specify a region or set the AWS_DEFAULT_REGION environment variable.

If, however, you use:

[profile "myprofile"]

It works fine. There is either a bug in the docs or a bug in the code.

Need an OrderedDict implementation for Python 2.6

We want to use OrderedDict to preserve the order of the keys in the JSON files but 2.6 doesn't have it in stdlib. Need to workaround that and also then use simplejson on 2.6 to use the object_pairs_hook which json in 2.6 does not support.
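For reference, json.loads grew the object_pairs_hook argument in Python 2.7/3.1; on 2.6 the same call needs simplejson, as described above:

```python
import json
from collections import OrderedDict

# object_pairs_hook receives the key/value pairs in document order,
# so an OrderedDict preserves the order of keys in the JSON file.
doc = '{"b": 1, "a": 2, "c": 3}'
ordered = json.loads(doc, object_pairs_hook=OrderedDict)
print(list(ordered))  # ['b', 'a', 'c'] -- keys in file order
```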

S3 CopyObject fails with 501

Attempting an aws s3 copy-object command results in a "501 Not Implemented" error. This is caused by the `Transfer-Encoding: chunked` header that requests is adding to the request.

RequestSpotInstances: "The parameter BlockDeviceMappings is not recognized"

Greetings. I'm trying to request a spot instance with ephemeral disks mapped:

import botocore.session

session = botocore.session.get_session()
session.set_debug_logger()
ec2 = session.get_service('ec2')
operation = ec2.get_operation('RequestSpotInstances')
endpoint = ec2.get_endpoint('us-east-1')
response = operation.call(
    endpoint,
    spot_price='1.00',
    instance_count=1,
    launch_specification={
        'image_id': 'ami-33ec795a',
        'instance_type': 'cc2.8xlarge',
        'block_device_mappings': [
            {"device_name": "/dev/sdb", "virtual_name": "ephemeral0"},
            {"device_name": "/dev/sdc", "virtual_name": "ephemeral1"},
            {"device_name": "/dev/sdd", "virtual_name": "ephemeral2"},
            {"device_name": "/dev/sde", "virtual_name": "ephemeral3"}
        ]
    },
)

And I'm getting the following error:

DEBUG - <?xml version="1.0" encoding="UTF-8"?><Response><Errors><Error><Code>UnknownParameter</Code><Message>The parameter BlockDeviceMappings is not recognized</Message></Error></Errors><RequestID>6536e1a9-1314-4e04-af89-9f554ec79237</RequestID></Response>

Digging into the debug output, I see:

DEBUG - label=
DEBUG - label=LaunchSpecification
DEBUG - label=LaunchSpecification.ImageId
DEBUG - label=BlockDeviceMappings.Item.1
DEBUG - label=BlockDeviceMappings.Item.1.DeviceName
DEBUG - label=BlockDeviceMappings.Item.1.VirtualName
DEBUG - label=BlockDeviceMappings.Item.2
DEBUG - label=BlockDeviceMappings.Item.2.DeviceName
DEBUG - label=BlockDeviceMappings.Item.2.VirtualName
DEBUG - label=BlockDeviceMappings.Item.3
DEBUG - label=BlockDeviceMappings.Item.3.DeviceName
DEBUG - label=BlockDeviceMappings.Item.3.VirtualName
DEBUG - label=BlockDeviceMappings.Item.4
DEBUG - label=BlockDeviceMappings.Item.4.DeviceName
DEBUG - label=BlockDeviceMappings.Item.4.VirtualName
DEBUG - label=LaunchSpecification.InstanceType
DEBUG - label=

This doesn't look right. The labels should be, e.g.,

LaunchSpecification.BlockDeviceMappings.1.DeviceName

rather than

BlockDeviceMappings.Item.1.DeviceName

So I made the following simple change:

diff --git a/botocore/parameters.py b/botocore/parameters.py
index c5e7e72..5c8598b 100644
--- a/botocore/parameters.py
+++ b/botocore/parameters.py
@@ -224,7 +224,8 @@ class ListParameter(Parameter):
                 member_name = member_type.xmlname
             else:
                 member_name = 'member'
-            label = '%s.%s' % (self.name, member_name)
+            if member_name != 'Item':
+                label = '%s.%s' % (self.name, member_name)
         for i, v in enumerate(value, 1):
             member_type.build_parameter_query(v, built_params,
                                               '%s.%d' % (label, i),

and now I see

DEBUG - label=
DEBUG - label=LaunchSpecification
DEBUG - label=LaunchSpecification.ImageId
DEBUG - label=LaunchSpecification.BlockDeviceMapping.1
DEBUG - label=LaunchSpecification.BlockDeviceMapping.1.DeviceName
DEBUG - label=LaunchSpecification.BlockDeviceMapping.1.VirtualName
DEBUG - label=LaunchSpecification.BlockDeviceMapping.2
DEBUG - label=LaunchSpecification.BlockDeviceMapping.2.DeviceName
DEBUG - label=LaunchSpecification.BlockDeviceMapping.2.VirtualName
DEBUG - label=LaunchSpecification.BlockDeviceMapping.3
DEBUG - label=LaunchSpecification.BlockDeviceMapping.3.DeviceName
DEBUG - label=LaunchSpecification.BlockDeviceMapping.3.VirtualName
DEBUG - label=LaunchSpecification.BlockDeviceMapping.4
DEBUG - label=LaunchSpecification.BlockDeviceMapping.4.DeviceName
DEBUG - label=LaunchSpecification.BlockDeviceMapping.4.VirtualName
DEBUG - label=LaunchSpecification.InstanceType
DEBUG - label=

and the request succeeds. Not sure if this is the right fix, though. Thoughts?

Thanks!

CommonPrefixes incorrect parsing on ListMultipartUploads

Not sure if this is specific to ListMultipartUploads or more general, but given a response like this (I added the indentation to show the CommonPrefix part more clearly):

<?xml version="1.0" encoding="UTF-8"?>
<ListMultipartUploadsResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/"><Bucket>botocoretest1374528673-218</Bucket><KeyMarker></KeyMarker><UploadIdMarker></UploadIdMarker><NextKeyMarker></NextKeyMarker><NextUploadIdMarker></NextUploadIdMarker><Delimiter>/</Delimiter><Prefix>foo</Prefix><MaxUploads>1000</MaxUploads><IsTruncated>false</IsTruncated>
<CommonPrefixes><Prefix>foo/</Prefix></CommonPrefixes>
<CommonPrefixes><Prefix>foobar/</Prefix></CommonPrefixes>
</ListMultipartUploadsResult>

I get a parsed response like this:

{
    "UploadIdMarker": null,
    "CommonPrefixes": {
        "Prefix": "foobar/"
    },
    "NextKeyMarker": null,
    "Bucket": "botocoretest1374528673-218",
    "Prefix": "foo",
    "NextUploadIdMarker": null,
    "Delimiter": "/",
    "Uploads": [],
    "KeyMarker": null,
    "MaxUploads": 1000,
    "IsTruncated": false
}

I noticed this while working on some of the paginator code. In this scenario I had a number of multipart uploads in foo/<keys> and foobar/<keys>.
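The expected behaviour can be sketched with the standard-library parser (namespace omitted for brevity; the helper name is hypothetical): repeated elements should accumulate into a list rather than overwrite each other.

```python
import xml.etree.ElementTree as ET

def parse_common_prefixes(xml_body):
    """Collect every repeated <CommonPrefixes> element into a list,
    instead of letting later elements overwrite earlier ones
    (sketch of the expected parsing behaviour)."""
    root = ET.fromstring(xml_body)
    return [cp.findtext('Prefix') for cp in root.findall('CommonPrefixes')]

body = (
    '<ListMultipartUploadsResult>'
    '<CommonPrefixes><Prefix>foo/</Prefix></CommonPrefixes>'
    '<CommonPrefixes><Prefix>foobar/</Prefix></CommonPrefixes>'
    '</ListMultipartUploadsResult>'
)
print(parse_common_prefixes(body))  # ['foo/', 'foobar/']
```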

Response from EC2 DescribeInstanceAttribute is incorrect

The response parser incorrectly duplicates the one returned value into all possible attributes creating a response like this:

$ aws --region us-east-1 ec2 describe-instance-attribute --instance `wget -q -O- http://169.254.169.254/latest/meta-data/instance-id` --attribute rootDeviceName
{
    "UserData": {
        "Value": "/dev/sda1"
    },
    "ProductCodes": [],
    "InstanceId": "i-7a81571c",
    "InstanceInitiatedShutdownBehavior": {
        "Value": "/dev/sda1"
    },
    "RootDeviceName": {
        "Value": "/dev/sda1"
    },
    "EbsOptimized": {
        "Value": false
    },
    "BlockDeviceMappings": [],
    "KernelId": {
        "Value": "/dev/sda1"
    },
    "RamdiskId": {
        "Value": "/dev/sda1"
    },
    "DisableApiTermination": {
        "Value": false
    },
    "InstanceType": {
        "Value": "/dev/sda1"
    }
}

Can't create multipart upload for s3 in python3

$ aws --debug s3 create-multipart-upload --bucket bucket --key key
Traceback (most recent call last):
  File "aws-cli/awscli/clidriver.py", line 289, in call
    **params)
  File "botocore/botocore/operation.py", line 53, in call
    return endpoint.make_request(self, params)
  File "botocore/botocore/endpoint.py", line 189, in make_request
    prepared_request = self.prepare_request(request)
  File "botocore/botocore/endpoint.py", line 69, in prepare_request
    self.auth.add_auth(request=request)
  File "botocore/botocore/auth.py", line 386, in add_auth
    request.headers)
  File "botocore/botocore/auth.py", line 378, in get_signature
    headers)
  File "botocore/botocore/auth.py", line 369, in canonical_string
    cs += self.canonical_resource(split)
  File "botocore/botocore/auth.py", line 356, in canonical_resource
    qsa.sort(cmp=lambda x, y:cmp(x[0], y[0]))
TypeError: 'cmp' is an invalid keyword argument for this function

The cmp arg to sort() was removed in python3. It'd be great to get some unittests around this as well.
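The portable replacement for a cmp comparator is a key function, e.g. for sorting query-string tuples by name:

```python
# sort(cmp=...) was removed in Python 3; sorting by the first tuple
# element with a key function gives the same ordering:
qsa = [('uploads', ''), ('partNumber', '1'), ('acl', '')]
qsa.sort(key=lambda pair: pair[0])
print(qsa)  # [('acl', ''), ('partNumber', '1'), ('uploads', '')]
```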

IAM CreateVirtualMFADevice response not parsed correctly

The response is coming back as:

{
    "ResponseMetadata": {
        "RequestId": "17f44a0c-cf13-11e2-aaf0-a16b2a7ddbf0"
    },
    "VirtualMFADevice": {
        "Base32StringSeed": null,
        "SerialNumber": "arn:aws:iam::336924118301:mfa/ExampleMFADevice",
        "QRCodePNG": null
    }
}

The Base32StringSeed and QRCodePNG values are of type blob which is not being handled in the current response parser.

Support for SimpleDB API.

Having great success working with botocore on lots of fronts with the supported APIs.

However, the lack of support for SimpleDB is a hole that forces me to fall back to old school boto when I need to use it. Is there a plan for SimpleDB support in botocore?

Allow auth handler to be passed into Endpoint class

If we move the call to get_auth one level up into the get_endpoint function, this would allow people to pass in whatever auth handler they wanted if creating Endpoints manually.

It would also make it easier to write unit tests.

Can't list objects with unicode chars in key name

Repro steps:

  • Create a key with a unicode char.
  • Try a ListObjects on that bucket (or use the aws s3 list-objects --bucket <name> command).

Traceback:

Traceback (most recent call last):
  File "aws-cli/awscli/clidriver.py", line 175, in main
    return command_table[parsed_args.command](remaining, parsed_args)
  File "aws-cli/awscli/clidriver.py", line 268, in __call__
    return command_table[parsed_args.operation](remaining, parsed_globals)
  File "aws-cli/awscli/clidriver.py", line 703, in __call__
    self._operation_object, call_parameters, parsed_globals)
  File "aws-cli/awscli/clidriver.py", line 775, in invoke
    parsed_globals)
  File "aws-cli/awscli/clidriver.py", line 789, in _display_response
    formatter(operation, response)
  File "aws-cli/awscli/formatter.py", line 52, in __call__
    response_data = response.build_full_result()
  File "botocore/botocore/paginate.py", line 256, in build_full_result
    for vals in zip_longest(*iterators):
  File "botocore/botocore/paginate.py", line 291, in __iter__
    for _, page in self._pages_iterator:
  File "botocore/botocore/paginate.py", line 144, in __iter__
    **current_kwargs)
  File "botocore/botocore/operation.py", line 63, in call
    response = endpoint.make_request(self, params)
  File "botocore/botocore/endpoint.py", line 72, in make_request
    return self._send_request(prepared_request, operation)
  File "botocore/botocore/endpoint.py", line 89, in _send_request
    response, exception = self._get_response(request, operation, attempts)
  File "botocore/botocore/endpoint.py", line 107, in _get_response
    http_response), None)
  File "botocore/botocore/response.py", line 395, in get_response
    xml_response.parse(body, encoding)
  File "botocore/botocore/response.py", line 70, in parse
    parser.feed(s)
UnicodeEncodeError: 'ascii' codec can't encode characters in position 8475-8477: ordinal not in range(128)

Problem using botocore for S3

It seems like the file botocore/data/aws/s3.json is missing? When trying to instantiate the s3 service I get

botocore.exceptions.DataNotFoundError: Unable to load data for: aws/s3

Infinite loop when BOTO_DATA_PATH is used

I attempted to set BOTO_DATA_PATH to experiment with a non-AWS provider, and this code goes into an infinite loop on my system:

paths = os.environ['BOTO_DATA_PATH'].split(':')
for path in paths:
    path = os.path.expandvars(path)
    path = os.path.expanduser(path)
    paths.append(path)

The loop appends to "paths" whether or not expandvars and expanduser produce a different path, so the list being iterated over grows without bound.
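A sketch of the fix: build a separate expanded list instead of appending to the one being iterated over.

```python
import os

os.environ['BOTO_DATA_PATH'] = '~/data:/opt/data'

# Appending to a list while iterating over it never terminates;
# accumulating into a second list does:
expanded = []
for path in os.environ['BOTO_DATA_PATH'].split(':'):
    path = os.path.expandvars(path)
    path = os.path.expanduser(path)
    expanded.append(path)
print(expanded)  # two entries, with ~ expanded to the home directory
```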

Missing "xmlnamespace" field in cloudsearch.json file

When I use awscli tool run describe-domains command, I always get such error:

$ aws cloudsearch describe-domains --domain-name imdb-movies --region us-west-2
'Service' object has no attribute 'xmlnamespace'

I think the problem is that the cloudsearch.json file is missing the xmlnamespace field; when I add this field, the 'aws cloudsearch describe-domains' command works:

diff --git a/botocore/data/aws/cloudsearch.json b/botocore/data/aws/cloudsearch.json
index 971f437..a299ee6 100644
--- a/botocore/data/aws/cloudsearch.json
+++ b/botocore/data/aws/cloudsearch.json
@@ -5,6 +5,7 @@
"signature_version": "v2",
"service_full_name": "Amazon CloudSearch",
"endpoint_prefix": "cloudsearch",

SSL certificate validation error on the Support service

SSL certificate validation is failing for the Support service. This is because the commonName in the certificate is support.us-east-1.amazonaws.com rather than support.amazonaws.com, which is the hostname we are using.

This relates to the same bug we have encountered before in all versions of Python < 2.7.3. The fix is to change our code to use support.us-east-1.amazonaws.com as the hostname.

rest-json doesn't work when parameters are specified

For example, list-pipelines works:

$ aws elastictranscoder list-pipelines
{
    "Pipelines": []
}

But with an arg:

$ aws elastictranscoder list-jobs-by-pipeline --pipeline-id foo --debug
2013-04-26 12:15:21,021 - botocore.base - DEBUG - Attempting to Load: aws/_services/elastictranscoder
2013-04-26 12:15:21,021 - botocore.base - DEBUG - Attempting to Load: aws
2013-04-26 12:15:21,021 - botocore.base - DEBUG - Attempting to Load: aws/_services
2013-04-26 12:15:21,022 - botocore.base - DEBUG - Found data file: /usr/local/Cellar/python/2.7.3/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/botocore/data/aws/_services.json
2013-04-26 12:15:21,022 - botocore.base - DEBUG - Attempting to Load: aws/_services/elastictranscoder
2013-04-26 12:15:21,022 - botocore.base - DEBUG - Attempting to Load: aws/elastictranscoder
2013-04-26 12:15:21,029 - botocore.base - DEBUG - Found data file: /usr/local/Cellar/python/2.7.3/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/botocore/data/aws/elastictranscoder.json
2013-04-26 12:15:21,036 - botocore.credentials - INFO - Found credentials in boto config file
2013-04-26 12:15:21,036 - botocore.operation - DEBUG - {u'pipeline_id': 'foo'}
2013-04-26 12:15:21,036 - botocore.operation - DEBUG - {u'pipeline_id': 'foo'}
2013-04-26 12:15:21,036 - botocore.endpoint - DEBUG - {'headers': {}, 'uri_params': {}, 'payload': None}
2013-04-26 12:15:21,037 - botocore.endpoint - DEBUG - SSL Verify: True
2013-04-26 12:15:21,037 - botocore.endpoint - DEBUG - path: /2012-09-25/jobsByPipeline/{PipelineId}
2013-04-26 12:15:21,037 - botocore.endpoint - DEBUG - query_params: Ascending={Ascending}&PageToken={PageToken}
Traceback (most recent call last):
  File "/usr/local/Cellar/python/2.7.3/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/awscli/clidriver.py", line 289, in call
    **params)
  File "/usr/local/Cellar/python/2.7.3/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/botocore/operation.py", line 53, in call
    return endpoint.make_request(self, params)
  File "/usr/local/Cellar/python/2.7.3/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/botocore/endpoint.py", line 184, in make_request
    uri = self.build_uri(operation, params)
  File "/usr/local/Cellar/python/2.7.3/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/botocore/endpoint.py", line 150, in build_uri
    pc = pc.format(**params['uri_params'])
KeyError: u'PipelineId'

SigV4Auth should not use cached timestamps for signing requests.

Versions

Python Interpreter Version: 2.7.5
Botocore Version: 18.0

Issue Description:

The SigV4Auth class generates a signature timestamp in its init and uses that timestamp for signing all requests for the endpoint. This means that an endpoint can only generate correctly signed requests for 5 minutes before the requests are rejected by AWS services as having an expired timestamp.

For example:

Signature expired: 20131008T001036Z is now earlier than 20131008T001112Z (20131008T001612Z - 5 min.)

Use Case

Ran into this problem when using botocore to construct complex CloudFormation stacks that took extended periods of time to complete. I was getting signature expired errors from CloudFormation when polling for completion of the stack construction after 5 minutes because it was using the same endpoint object to poll as it used to issue the initial CreateStack.

Reproducing the Issue:

session = botocore.session.get_session()
service = session.get_service('ec2')
endpoint = service.get_endpoint('us-east-1')
<wait more than 5 minutes>
operation = service.get_operation(<method>)
operation.call(endpoint, ...)

This will result in a 403 error with a signature expired error like this:

Signature expired: 20131008T001036Z is now earlier than 20131008T001112Z (20131008T001612Z - 5 min.)

Suggested Fix

Instead of generating a cached signature timestamp in init for SigV4Auth, it should be generated dynamically for each signature operation.

Workaround

The issue can be worked around by calling get_endpoint() for the service before each operation call which will cause a new signature timestamp to be regenerated for requests on the new endpoint and thus avoid the expired timestamp issue.
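A sketch of the suggested fix, with hypothetical names (the real signing logic lives in botocore/auth.py): compute the timestamp at request time rather than caching it at construction.

```python
import datetime

class FreshTimestampSigner:
    """Sketch of the suggested fix (names hypothetical): generate the
    signing timestamp per request instead of caching it in __init__."""
    def add_auth(self, request_headers):
        # A fresh timestamp on every call keeps long-lived endpoints
        # inside AWS's 5-minute clock-skew window.
        now = datetime.datetime.utcnow()
        request_headers['X-Amz-Date'] = now.strftime('%Y%m%dT%H%M%SZ')
        return request_headers

headers = FreshTimestampSigner().add_auth({})
print(headers['X-Amz-Date'])  # e.g. 20131008T001036Z
```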

Allow customizing the user-agent name in botocore

Hi, All

AWS has enabled the service "AWS CloudTrail", which can capture AWS API activity. In its output, the "userAgent" is also recorded.

I suggest making the user-agent name in botocore customizable, so that I could initialize each of my modules with a specific user-agent name. When reading the CloudTrail output, it would then be much easier to tell which module made the API call; most of the time the instance and API call name alone are not enough.

I wish you could add this as an enhancement.

Thanks
Henry
