aws-beam / aws-erlang
Create, configure, and manage AWS services from Erlang code.
License: Other
Hi there,
TL;DR: Is there any reason why aws-erlang does not use aws_signature?
At my company we use aws-erlang extensively to interact with S3, and we've had several issues with it over the past month or so, leading us to consider erlcloud, which is a shame.
The problems seem to stem from aws-erlang implementing its own version of aws_signature natively inside aws_request:sign_request/5 rather than using aws_signature.
List of issues that have been discovered with signatures:
#61
#62
Both of the above issues are fixed in aws_signature, which is also the library of choice for aws-elixir, as can be seen here: mix.exs#L30.
In general, it seems there are a few diffs between aws-elixir and aws-erlang where fixes in one are not ported to the other, or not even opened as an issue for someone to grab. The further this goes, the harder it will be to figure out the diff and bring the two back in line with each other. I, for one, am not even sure whether the fixes that we (from Klarna) have implemented in aws-erlang have been ported to aws-elixir, or whether they are even applicable there, since we don't use Elixir and have no good way of testing this either.
If there's no good reason for aws-erlang having its own implementation, I'd like to really comb the code of aws_signature, see if there are any differences that could lead to a regression, and check whether it's feasible to simply replace aws_request:sign_request/5 with a call to aws_signature:sign_v4/10, with the respective uri_encode_path option dance for S3, as that's apparently special :-)
Background
The result from a successful request to aws-erlang typically looks like this:
{ok, Response, {HttpStatusCode, Headers, Ref}} = .. aws_something:something(...),
The Ref can be used in the API for hackney.
We should not encourage users to call the hackney API as we want to be able to switch from hackney to something else if e.g. Hackney becomes unmaintained.
Solutions
A solution would be to make the ref opaque and implement a wrapper module in aws-erlang through which we tunnel all the hackney calls that could be useful. This would make us a little more future-compatible.
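A minimal sketch of that wrapper idea follows. The module name aws_body and its function names are hypothetical, not part of aws-erlang today; the real module would also export tunnelling functions (e.g. a stream_body/1 that forwards to hackney) built on unwrap/1.

```erlang
-module(aws_body).

-export([wrap/1, unwrap/1]).

%% Callers only ever see a body_ref(); they cannot depend on the raw
%% hackney ref, so the HTTP client can be swapped out later.
-opaque body_ref() :: {?MODULE, term()}.
-export_type([body_ref/0]).

%% Wrap the raw hackney ref before handing it to the caller.
-spec wrap(term()) -> body_ref().
wrap(Ref) -> {?MODULE, Ref}.

%% Internal accessor used by the tunnelling functions.
-spec unwrap(body_ref()) -> term().
unwrap({?MODULE, Ref}) -> Ref.
```

Because the type is opaque, Dialyzer would flag any caller that pattern-matches on the tuple directly, which is exactly the coupling we want to prevent.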
This issue aims to solve AWS S3 testing specifically as it is likely one of the most used modules of aws-erlang.
See the milestone for further details.
The CI pipeline of aws_erlang should include running integration tests against real AWS. Since there's a limit to how many requests can be made without incurring costs (requests beyond it will be blocked), we should be careful about when this pipeline runs, so that a single user cannot spam the CI, exhaust the limits, and prevent any further tests from running for the remainder of that month.
The pipeline could run only when a devel-* branch is pushed. This way, PRs from forks/branches that touch things which should be tested against AWS can be tested by pushing a devel-<some-commit-that-should-be-tested> branch.
This issue aims to solve AWS Cloudwatch Logs testing specifically as it is likely one of the most used modules of aws-erlang.
See the milestone for further details.
An aws_cloudwatch_logs_SUITE is created which can talk to a real AWS Account and perform certain AWS Requests towards it. A subset of the functionality should be tested to start with.
See: https://docs.aws.amazon.com/AmazonS3/latest/userguide/PresignedUrlUploadObject.html
Certain clients do not have permission to upload objects directly to S3. Hence, a service may need to generate a presigned URL that the other client can then use to read/write an object in S3.
The aws_erlang hex package is owned by @talentdeficit; once the release is created we should reach out and ask them to publish the new version.
Ergo, the docs don't get updated correctly.
I would like to use the API to create an upload URL using aws_erlang. Unfortunately, it's not clear what the expected parameters are for the function
create_upload_url(Client, Input)
which "Gets a pre-signed S3 write URL that you use to upload the zip archive when importing a bot or a bot locale."
There might be more functions which require enhanced documentation. Thank you!
Add a build pipeline using GitHub actions.
This is the badge format supported by GitHub themselves:
[![Actions Status](https://github.com/{owner}/{repo}/workflows/{workflow_name}/badge.svg)](https://github.com/{owner}/{repo}/actions)
rebar3 format is becoming the industry standard for formatting Erlang code. We should follow along and use rebar3 format across the project.
Make use of a .git-blame-ignore-revs file to ignore the formatter commit: https://michaelheap.com/git-ignore-rev/
The lambda API combined with Erlang is very useful (to me, at least!); it saves the hassle of provisioning/paying for an API Gateway when I want to invoke lambdas directly from Erlang using AWS creds/perms.
In order to avoid issues such as aws-beam/aws-elixir#180, we need the capability to pass send_body_as_binary | receive_body_as_binary as part of the Options. This way, such an option can be passed when needed, as for example in aws_polly:synthesize_speech/3.
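A sketch of how such an option could be honoured when handling a response body is shown below. The module and function names are illustrative, not the actual aws-erlang code, and DecodeFun stands in for the real JSON/XML decoding step.

```erlang
-module(aws_body_options).
-export([maybe_decode/3]).

%% Decode the response body unless the caller asked for the raw bytes,
%% e.g. the MP3 stream returned by aws_polly:synthesize_speech/3.
-spec maybe_decode(binary(), [send_body_as_binary | receive_body_as_binary],
                   fun((binary()) -> term())) -> term().
maybe_decode(Body, Options, DecodeFun) ->
    case lists:member(receive_body_as_binary, Options) of
        true  -> Body;            %% raw binary requested; skip decoding
        false -> DecodeFun(Body)  %% default: decode as today
    end.
```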
Does anyone else find that when a request is made, large amounts of binary data are printed to stdout? I imagine it's not a problem when multiple requests are made sequentially, based on the way the supervisor is configured.
I understand that it's using hackney, and this may be the source of the verbose messaging. Is there a way, or an option I can pass, to control the level of output? I believe the most annoying output is the CA certs printout.
I looked at the hackney options but I can't find the right option to control this output. Looking at the code, it seems that if you use aws_<<svc>>:func/4 you can add an extra option at the end. This output is a problem for me in log files and when the application starts up.
I recently asked our local aws-erlang users whether an upload of a file to S3 could avoid reading the file's contents into memory first and instead stream it, using something like Linux's sendfile. Alas, the answer I got was that this project doesn't support that.
Hence this is a request to enable that feature.
This issue aims to solve AWS DynamoDB testing specifically as it is likely one of the most used modules of aws-erlang.
See the milestone for further details.
An aws_dynamodb_SUITE is created which can talk to a real AWS Account and perform certain AWS Requests towards it. A subset of the functionality should be tested to start with.
This issue aims to solve AWS Cloudwatch testing specifically as it is likely one of the most used modules of aws-erlang.
See the milestone for further details.
An aws_cloudwatch_SUITE is created which can talk to a real AWS Account and perform certain AWS Requests towards it. A subset of the functionality should be tested to start with.
This morning I found the following error and stack dump in our alarms:
<fun> failed type=exit reason:
{{body_decode_failed, exit,
  {fatal,
   {expected_element_start_tag,
    {file, file_name_unknown},
    {line, 1},
    {col, 1}}},
  503, <<>>},
 [{xmerl_scan, fatal, 2, [{file, "xmerl_scan.erl"}, {line, 4127}]},
  {xmerl_scan, scan_document, 2, [{file, "xmerl_scan.erl"}, {line, 575}]},
  {xmerl_scan, string, 2, [{file, "xmerl_scan.erl"}, {line, 294}]},
  {aws_util, decode_xml, 1, [{file, "/.../lib/aws/src/aws_util.erl"}, {line, 116}]},
  {aws_s3, handle_response, 3, [{file, "/.../lib/aws/src/aws_s3.erl"}, {line, 7944}]},
  {aws_request, do_request, 2, [{file, "/.../lib/aws/src/aws_request.erl"}, {line, 234}]},
  {aws_s3, upload_part, 5, [{file, "/.../lib/aws/src/aws_s3.erl"}, {line, 7542}]},
It seems that one aws_s3:upload_part/5 call got a 503 response with an empty body from S3. aws_s3:handle_response/3 then tries to aws_util:decode_xml/1 that empty body (<<>>), which raises an exit. That exit is caught and re-raised as an error with some decoration in the reason.
The bug is that handle_response/3 is supposed to return {error, _, {503, _, _}} in this case. For 5xx responses we can't assume the body is valid XML.
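A sketch of the fix: only attempt XML decoding for success responses, and return the error triple untouched for error statuses. The clause shapes below are illustrative rather than the exact aws_s3 code, and DecodeFun stands in for aws_util:decode_xml/1.

```erlang
-module(handle_response_sketch).
-export([handle_response/3]).

%% 2xx: decode the body as today.
handle_response({ok, Status, Headers, Body}, DecodeFun, _Opts)
  when Status >= 200, Status < 300 ->
    {ok, DecodeFun(Body), {Status, Headers, Body}};
%% 4xx/5xx: never assume the body is valid XML; a 503 may come with an
%% empty body, so hand it back to the caller untouched.
handle_response({ok, Status, Headers, Body}, _DecodeFun, _Opts)
  when Status >= 400 ->
    {error, Body, {Status, Headers, Body}}.
```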
For instance, when creating a user with tags, you have to specify the tags like so:
aws_iam:create_user(Client, #{<<"UserName">> => <<"Name">>, <<"Tag.member.1.Key">> => <<"env">>, <<"Tag.member.1.Value">> => <<"development">>})
It would be nice if we could support adding the tags as a map like
<<"Tags">> => #{<<"Key1">> => <<"Value1">>}}
Or more closely follow the json structure in the api specification
<<"Tags">> => [#{<<"Key">> => <<"Key1">>,<<"Value">> => <<"Value1">>}]}
Providing an integer as a value in the body results in an error:
Observed behaviour
1> aws_sqs:receive_message(aws_client:make_client(), #{<<"QueueUrl">> => <<"https://sqs.eu-west-1.amazonaws.com/0123456789/some-queue">>, <<"VisibilityTimeout">> => 30}).
** exception error: bad argument
in function iolist_to_binary/1
called as iolist_to_binary({error,invalid_input,30})
in call from crypto:hash/2 (crypto.erl, line 373)
in call from aws_util:sha256_hexdigest/1 (/Users/juan.facorro/dev/aws-beam/aws_erlang/src/aws_util.erl, line 44)
in call from aws_request:add_content_hash_header/2 (/Users/juan.facorro/dev/aws-beam/aws_erlang/src/aws_request.erl, line 99)
in call from aws_request:sign_request/10 (/Users/juan.facorro/dev/aws-beam/aws_erlang/src/aws_request.erl, line 70)
in call from aws_sqs:request/4 (/Users/juan.facorro/dev/aws-beam/aws_erlang/src/aws_sqs.erl, line 753)
Expected behaviour
1> aws_sqs:receive_message(aws_client:make_client(), #{<<"QueueUrl">> => <<"https://sqs.eu-west-1.amazonaws.com/0123456789/some-queue">>, <<"VisibilityTimeout">> => 30}).
{ok,#{<<"ReceiveMessageResponse">> =>
#{<<"ReceiveMessageResult">> => none,
<<"ResponseMetadata">> =>
#{<<"RequestId">> =>
<<"47a69f91-b5c1-5f9d-b696-c8241773c26f">>}}},
{200,
[{<<"x-amzn-RequestId">>,
<<"47a69f91-b5c1-5f9d-b696-c8241773c26f">>},
{<<"Date">>,<<"Tue, 18 Aug 2020 09:33:38 GMT">>},
{<<"Content-Type">>,<<"text/xml">>},
{<<"Content-Length">>,<<"240">>}],
#Ref<0.1509001447.848297986.251305>}}
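One way to get the expected behaviour would be to coerce non-binary input values to binaries before the request body is built and hashed, so a value like 30 never reaches iolist_to_binary/1 as a bare integer. The module and function names below are illustrative, not the actual aws-erlang code.

```erlang
-module(coerce_input).
-export([to_binary/1]).

%% Coerce common Erlang term types to the binary form the signing code
%% expects; anything already binary passes through unchanged.
-spec to_binary(term()) -> binary().
to_binary(V) when is_binary(V)  -> V;
to_binary(V) when is_integer(V) -> integer_to_binary(V);
to_binary(V) when is_atom(V)    -> atom_to_binary(V, utf8);
to_binary(V) when is_list(V)    -> iolist_to_binary(V).
```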
When making a Client, the service entry of the client map gets set to undefined. This causes an error when making a request, more specifically when aws_util:binary_join/3 gets hit in aws_util:credential_scope/3. Perhaps the aws_client interface should allow the service to be set?
To reproduce...
Result:
** exception error: bad argument
in function bit_size/1
called as bit_size(undefined)
in call from aws_util:binary_join/3 (/aws-erlang/_build/default/lib/aws/src/aws_util.erl, line 51)
in call from aws_request:sign_request/10 (/aws-erlang/_build/default/lib/aws/src/aws_request.erl, line 38)
When providing multiple querystring parameters (e.g. to aws_s3:list_objects/8), the response from AWS is the error "The request signature we calculated does not match the signature you provided. Check your key and signing method."
Run the following with valid values for bucket and prefix:
aws_s3:list_objects(aws_client:make_client(), <<"bucket">>, <<"/">>, undefined, undefined, undefined, <<"prefix">>, undefined).
The result is an error specifying that "The request signature we calculated does not match the signature you provided. Check your key and signing method."
The querystring parameters are sorted in reverse order. The request sends ?prefix=prefix&delimiter=%2F, while the expected querystring, as reported by the error returned from AWS, is ?delimiter=%2F&prefix=prefix.
aws_s3:head_object(Client, Bucket, Key, #{}).
exception error: no function clause matching
aws_s3:handle_response({ok,200, ...Headers...}, undefined, true)
When making a request with a client created through aws_client:make_temporary_client/4, the response is the following error:
{error,{<<"InvalidSignatureException">>,
<<"The request signature we calculated does not match the signature you provided. Check your AWS Secret"...>>},
{400,
[{<<"Server">>,<<"Server">>},
{<<"Date">>,<<"Thu, 16\nJul 2020 12:30:14 GMT">>},
{<<"Content-Type">>,<<"application/x-amz-json-1.1">>},
{<<"Content-Length">>,<<"229">>},
{<<"Connection">>,<<"keep-alive">>},
{<<"x-amzn-RequestId">>,
<<"0771a7ea-7aca-43e2-9b1e-312074724c5a">>}],
#Ref<0.1229173320.3980918785.234390>}}
$ rebar3 shell
1> application:ensure_all_started(aws).
2> Client = aws_client:make_temporary_client(<<"valid_id">>, <<"valid_secret">>, <<"valid_token">>, <<"eu-west-1">>).
3> aws_ssm:list_commands(Client, #{}).
{error,{<<"InvalidSignatureException">>,
<<"The request signature we calculated does not match the signature you provided. Check your AWS Secret"...>>},
{400,
[{<<"Server">>,<<"Server">>},
{<<"Date">>,<<"Thu, 16 Jul 2020 14:16:54 GMT">>},
{<<"Content-Type">>,<<"application/x-amz-json-1.1">>},
{<<"Content-Length">>,<<"229">>},
{<<"Connection">>,<<"keep-alive">>},
{<<"x-amzn-RequestId">>,
<<"b79951b9-cf8f-4ecf-b613-41b0a9fced0c">>}],
#Ref<0.1094149039.1313603590.42867>}}
The header X-Amz-Security-Token is being added twice (once in the aws_{service}:request/4 function and again in aws_request:sign_request/5), but the signature is generated with the header present only once, so the signature becomes invalid.
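One way to guard against this class of bug is to deduplicate headers by name before signing, keeping the first occurrence, so X-Amz-Security-Token is both signed and sent exactly once. The module and function names below are illustrative, not the actual aws-erlang code.

```erlang
-module(header_dedup).
-export([dedup/1]).

%% Drop later duplicates of any header name, preserving the order and
%% value of the first occurrence.
-spec dedup([{binary(), binary()}]) -> [{binary(), binary()}].
dedup(Headers) ->
    lists:foldl(
      fun({Name, _} = H, Acc) ->
              case lists:keymember(Name, 1, Acc) of
                  true  -> Acc;           %% already present; drop duplicate
                  false -> Acc ++ [H]
              end
      end, [], Headers).
```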