
Modularized AWS SDK for JavaScript.

License: Apache License 2.0


aws-sdk-js-v3's Introduction

AWS SDK for JavaScript v3


The AWS SDK for JavaScript v3 is a rewrite of v2 with some great new features. As with version 2, it enables you to easily work with Amazon Web Services, but it has a modular architecture with a separate package for each service. It also includes many frequently requested features, such as first-class TypeScript support and a new middleware stack. For more details, see the blog post on the general availability of the modular AWS SDK for JavaScript.

To get started with JavaScript SDK version 3, visit our Developer Guide or API Reference.

If you are starting a new project with the AWS SDK for JavaScript v3, you can refer to the aws-sdk-js-notes-app, which shows examples of calling multiple AWS services in a note-taking application. If you are migrating from v2 to v3, you can visit our self-guided workshop, which builds a basic version of the note-taking application using the AWS SDK for JavaScript v2 and provides step-by-step migration instructions to v3.

To test your universal JavaScript code in Node.js, browser, and React Native environments, visit our code samples repo.

Performance is crucial for the AWS SDK for JavaScript because it directly impacts the user experience. Please refer to the Performance section for more details.

Table of Contents

  1. Getting Started
  2. New Features
    1. Modularized packages
    2. API consistency changes
      1. Configuration
      2. Middleware Stack
    3. How to upgrade
  3. High Level Concepts in V3
    1. Generated Packages
    2. Streams
    3. Paginators
    4. Abort Controller
    5. Middleware Stack
  4. Working with the SDK in Lambda
  5. Performance
  6. Install from Source
  7. Giving feedback and contributing
  8. Release Cadence
  9. Node.js versions
  10. Stability of Modular Packages
  11. Known Issues
    1. Functionality requiring AWS Common Runtime (CRT)

Getting Started

Let’s walk through setting up a project that depends on DynamoDB from the SDK and makes a simple service call. The following steps use yarn as an example. These steps assume you have Node.js and yarn already installed.

  1. Create a new Node.js project.

  2. Inside of the project, run: yarn add @aws-sdk/client-dynamodb. Adding packages updates your lock file (yarn.lock or package-lock.json). You should commit your lock file along with your code to avoid potential breaking changes.

  3. Create a new file called index.js, create a DynamoDB service client and send a request.

const { DynamoDBClient, ListTablesCommand } = require("@aws-sdk/client-dynamodb");

(async () => {
  const client = new DynamoDBClient({ region: "us-west-2" });
  const command = new ListTablesCommand({});
  try {
    const results = await client.send(command);
    console.log(results.TableNames.join("\n"));
  } catch (err) {
    console.error(err);
  }
})();

If you want to use non-modular (v2-like) interfaces, you can import the client with only the service name (e.g., DynamoDB) and call the operation name directly on the client:

const { DynamoDB } = require("@aws-sdk/client-dynamodb");

(async () => {
  const client = new DynamoDB({ region: "us-west-2" });
  try {
    const results = await client.listTables({});
    console.log(results.TableNames.join("\n"));
  } catch (err) {
    console.error(err);
  }
})();

If you use tree shaking to reduce bundle size, using the non-modular interface will increase the bundle size compared to using the modular interface.
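For example, with modular ESM imports a bundler can drop the commands you never use. A minimal sketch (assumes an ESM module and a bundler or runtime with top-level await):

// Modular imports: only the client and the commands you use end up in the bundle.
import { DynamoDBClient, ListTablesCommand } from "@aws-sdk/client-dynamodb";

const client = new DynamoDBClient({ region: "us-west-2" });
const response = await client.send(new ListTablesCommand({}));
console.log(response.TableNames);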

If you are consuming the modular AWS SDK for JavaScript in React Native environments, you will need to add and import the following polyfills in your React Native application:

import "react-native-get-random-values";
import "react-native-url-polyfill/auto";
import { DynamoDB } from "@aws-sdk/client-dynamodb";

New features

Modularized packages

The SDK is now split up across multiple packages. The 2.x version of the SDK contained support for every service, which made it very easy to use multiple services in a project. However, it was difficult to reduce the size of the SDK when only a handful of services or operations were used, so many customers requested separate packages for each service client. We have also split up the core parts of the SDK so that service clients only pull in what they need. For example, a service that sends responses in JSON will no longer need an XML parser as a dependency.

For those that were already importing services as sub-modules from the v2 SDK, the import statement doesn’t look too different. Here’s an example of importing the AWS Lambda service in v2 of the SDK, and the v3 SDK:

// import the Lambda client constructor in v2 of the SDK
const Lambda = require("aws-sdk/clients/lambda");

// import the Lambda client constructor in v3 SDK
const { Lambda } = require("@aws-sdk/client-lambda");

It is also possible to import both versions of the Lambda client by changing the variable name the Lambda constructor is stored in.
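For instance, both versions can coexist during a migration by renaming one of the constructors. A minimal sketch:

// v2: the client constructor is the sub-module's export.
const LambdaV2 = require("aws-sdk/clients/lambda");

// v3: the client constructor is a named export, renamed here to avoid a collision.
const { Lambda: LambdaV3 } = require("@aws-sdk/client-lambda");

const v2Client = new LambdaV2({ region: "us-west-2" });
const v3Client = new LambdaV3({ region: "us-west-2" });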

API changes

We’ve made several public API changes to improve consistency, make the SDK easier to use, and remove deprecated or confusing APIs. The following are some of the big changes included in the new AWS SDK for JavaScript v3.

Configuration

In version 2.x of the SDK, service configuration could be passed to individual client constructors. However, these configurations would first be merged automatically into a copy of the global SDK configuration: AWS.config.

Also, calling AWS.config.update({/* params */}) only updated configuration for service clients instantiated after the update call was made, not any existing clients.

This behavior was a frequent source of confusion and made it difficult to add configuration to the global object that affects only a subset of service clients in a forward-compatible way. In v3, there is no longer a global configuration managed by the SDK. Configuration must be passed to each service client that is instantiated. It is still possible to share the same configuration across multiple clients, but that configuration will not be automatically merged with a global state.
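For example, a plain configuration object can simply be reused across clients. A minimal sketch (the option values are illustrative):

const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const { S3Client } = require("@aws-sdk/client-s3");

// Each client receives the configuration explicitly; there is no global state to mutate later.
const sharedConfig = { region: "us-west-2", maxAttempts: 5 };

const dynamodb = new DynamoDBClient(sharedConfig);
const s3 = new S3Client(sharedConfig);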

Middleware

Version 2.x of the SDK allowed modifying a request throughout multiple stages of its lifecycle by attaching event listeners to the request. Some feedback we frequently received was that it can be difficult to debug what went wrong during a request's lifecycle. We've switched to using a middleware stack to control the lifecycle of an operation call. This gives us a few benefits. Each middleware in the stack calls the next middleware after making any changes to the request object. This also makes debugging issues in the stack much easier, since you can see exactly which middleware have been called leading up to an error. Here's an example of logging requests using middleware:

const client = new DynamoDB({ region: "us-west-2" });

client.middlewareStack.add(
  (next, context) => async (args) => {
    console.log("AWS SDK context", context.clientName, context.commandName);
    console.log("AWS SDK request input", args.input);
    const result = await next(args);
    console.log("AWS SDK request output:", result.output);
    return result;
  },
  {
    name: "MyMiddleware",
    step: "build",
    override: true,
  }
);

await client.listTables({});

In the above example, we’re adding a middleware to our DynamoDB client’s middleware stack. The first argument is a function that accepts next, the next middleware in the stack to call, and context, an object that contains some information about the operation being called. It returns a function that accepts args, an object that contains the parameters passed to the operation and the request, and returns the result from calling the next middleware with args.

Other Changes

If you are looking for a breakdown of the API changes from AWS SDK for JavaScript v2 to v3, we have them listed in UPGRADING.md.

Working with the SDK in Lambda

General Info

The Lambda-provided AWS SDK is pinned to a specific minor version, NOT the latest version. To check the minor version used by Lambda, please refer to the Lambda runtimes documentation page. If you wish to use the latest version of the SDK, or a version different from the one provided by Lambda, we recommend that you bundle and minify your project, or upload it as a Lambda layer.

The performance of the AWS SDK for JavaScript v3 on Node.js 18 has improved over v2, as seen in the performance benchmarking.

Best practices

When using Lambda, use a single SDK client per service, per region, and initialize it outside of the handler's codepath. This optimizes for Lambda's container reuse.

The API calls themselves should be made from within the handler's codepath. This ensures that API calls are signed at the very last step of Lambda's execution cycle, after the Lambda is "hot", to avoid signing time skew.

Example:

import { STSClient, GetCallerIdentityCommand } from "@aws-sdk/client-sts";

const client = new STSClient({}); // SDK Client initialized outside the handler

export const handler = async (event) => {
  const response = {
    statusCode: 200,
    headers: {
      "Content-Type": "application/json",
    },
  };

  try {
    const results = await client.send(new GetCallerIdentityCommand({})); // API operation made from within the handler
    const responseBody = {
      userId: results.UserId,
    };

    response.body = JSON.stringify(responseBody);
  } catch (err) {
    console.log("Error:", err);
    response.statusCode = 500;
    response.body = JSON.stringify({
      message: "Internal Server Error",
    });
  }

  return response;
};

Performance

Please refer to the supplemental docs on performance for more details.

Install from Source

All clients have been published to NPM and can be installed as described above. If you want to try the latest clients, you can build them from source as follows:

  1. Clone this repository:

    git clone https://github.com/aws/aws-sdk-js-v3.git
    
  2. From the repository root directory, run the following command to link and build the whole library (the process may take several minutes):

    yarn && yarn test:all
    

    For more information, please refer to contributing guide.

  3. After the repository is successfully built, change directory to the client that you want to install, for example:

    cd clients/client-dynamodb
    
  4. Pack the client:

    yarn pack .
    

    yarn pack will create an archive file in the client package folder, e.g. aws-sdk-client-dynamodb-v3.0.0.tgz.

  5. Change directory to the project you are working on and move the archive to the location where you store vendor packages:

    mv path/to/aws-sdk-js-v3/clients/client-dynamodb/aws-sdk-client-dynamodb-v3.0.0.tgz ./path/to/vendors/folder
    
  6. Install the package to your project:

    yarn add ./path/to/vendors/folder/aws-sdk-client-dynamodb-v3.0.0.tgz
    

Giving feedback and contributing

You can provide feedback to us in several ways; both positive and negative feedback is appreciated. Please feel free to open an issue on our GitHub repository. Our GitHub issues page also includes work we know still needs to be done to reach full feature parity with the v2 SDK.

Feedback

GitHub issues. Customers who are comfortable giving public feedback can open a GitHub issue in the new repository. This is the preferred mechanism to give feedback so that other customers can engage in the conversation, +1 issues, etc. Issues you open will be evaluated and included in our roadmap.

Gitter channel. For informal discussion or general feedback, you may join the Gitter chat. The Gitter channel is also a great place to get help with v3 from other developers. The JS SDK team doesn't track the discussion daily, so feel free to open a GitHub issue if your question is not answered there.

Contributing

You can open pull requests for fixes or additions to the new AWS SDK for JavaScript v3. All pull requests must be submitted under the Apache 2.0 license and will be reviewed by an SDK team member prior to merging. Accompanying unit tests are appreciated. See Contributing for more information.

High Level Concepts

This is an introduction to some of the high level concepts behind AWS SDK for JavaScript (v3) which are shared between services and might make your life easier. Please consult the user guide and API reference for service specific details.

Terminology:

Bare-bones clients/commands: This refers to a modular way of consuming individual operations on JS SDK clients. It results in less code being imported, and is thus more performant. It is otherwise equivalent to the aggregated clients/commands.

// this imports a bare-bones version of S3 that exposes the .send operation
import { S3Client } from "@aws-sdk/client-s3"

// this imports just the getObject operation from S3
import { GetObjectCommand } from "@aws-sdk/client-s3"

//usage
const bareBonesS3 = new S3Client({...});
await bareBonesS3.send(new GetObjectCommand({...}));

Aggregated clients/commands: This refers to a way of consuming clients that contain all operations on them. Under the hood this calls the bare-bones commands. It imports all commands on a particular client, resulting in more code being imported, and is thus less performant. This is 1:1 with v2's style.

// this imports an aggregated version of S3 that exposes the .send operation
import { S3 } from "@aws-sdk/client-s3"

// No need to import an operation as all operations are already on the S3 prototype

//usage
const aggregatedS3 = new S3({...});
await aggregatedS3.getObject({...});

Generated Code

The v3 codebase is generated from internal AWS models that AWS services expose. We use smithy-typescript to generate all code in the /clients subdirectory. These packages always have a prefix of @aws-sdk/client-XXXX and are one-to-one with AWS services and service operations. You should be importing @aws-sdk/client-XXXX for most usage.

Clients depend on common "utility" code in /packages. The code in /packages is manually written and, outside of special cases (like credentials or the abort controller), is generally not very useful on its own.

Lastly, we have higher-level libraries in /lib. These are JavaScript-specific libraries that wrap client operations to make them easier to work with. Popular examples are @aws-sdk/lib-dynamodb, which simplifies working with items in Amazon DynamoDB, and @aws-sdk/lib-storage, which exposes the Upload function and simplifies parallel uploads in S3's multipartUpload.

  1. /packages. This subdirectory is where most manual code updates are done. These are published to NPM under @aws-sdk/XXXX and have no special prefix.
  2. /clients. This subdirectory is code generated and depends on code published from /packages. It is 1:1 with AWS services and operations. Manual edits should generally not occur here. These are published to NPM under @aws-sdk/client-XXXX.
  3. /lib. This subdirectory depends on generated code published from /clients. It wraps existing AWS services and operations to make them easier to work with in JavaScript. These are published to NPM under @aws-sdk/lib-XXXX.

Streams

Certain command outputs include streams, which have different implementations in Node.js and browsers. For convenience, a set of stream handling methods is merged (via Object.assign) onto the output stream object, as defined in SdkStreamMixin.

Output types having this feature will be indicated by the WithSdkStreamMixin<T, StreamKey> wrapper type, where T is the original output type and StreamKey is the output property key having a stream type specific to the runtime environment.

Here is an example using S3::GetObject.

import { S3 } from "@aws-sdk/client-s3";

const client = new S3({});

const getObjectResult = await client.getObject({
  Bucket: "...",
  Key: "...",
});

// env-specific stream with added mixin methods.
const bodyStream = getObjectResult.Body;

// one-time transform.
const bodyAsString = await bodyStream.transformToString();

// throws an error on 2nd call, stream cannot be rewound.
const __error__ = await bodyStream.transformToString();

Note that these methods read the stream in order to collect it, so you must save the output. The methods cannot be called more than once on a stream.
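For example, collect the body once and derive everything else from the saved value. A sketch, assuming a fresh getObject result (transformToByteArray is another of the mixin methods):

// Read the stream once and keep the collected bytes.
const bytes = await getObjectResult.Body.transformToByteArray();

// Derive further representations from the saved value, not from the stream.
const text = new TextDecoder().decode(bytes);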

Paginators

Many AWS operations return paginated results when the response object is too large to return in a single response. In AWS SDK for JavaScript v2, the response contains a token you can use to retrieve the next page of results. You then need to write additional functions to process pages of results.

In AWS SDK for JavaScript v3, we’ve improved pagination using async generator functions, which are similar to generator functions, with the following differences:

  • When called, async generator functions return an object, an async generator whose methods (next, throw, and return) return promises for { value, done }, instead of directly returning { value, done }. This automatically makes the returned async generator objects async iterators.
  • await expressions and for await (x of y) statements are allowed.
  • The behavior of yield* is modified to support delegation to async iterables.

Async iterators were added in the ES2018 edition of JavaScript. They are supported by Node.js 10.x+ and by all modern browsers, including Chrome 63+, Firefox 57+, Safari 11.1+, and Edge 79+. If you're using TypeScript v2.3+, you can compile async iterators down to older versions of JavaScript.

An async iterator is much like an iterator, except that its next() method returns a promise for a { value, done } pair. As an implicit aspect of the async iteration protocol, the next promise is not requested until the previous one resolves. This is a simple, yet very powerful, pattern.

Example Pagination Usage

In v3, the clients expose paginateOperationName APIs that are written using async generators, allowing you to use async iterators in a for await..of loop. You can perform the paginateListTables operation from @aws-sdk/client-dynamodb as follows:

const {
  DynamoDBClient,
  paginateListTables,
} = require("@aws-sdk/client-dynamodb");

...
const paginatorConfig = {
  client: new DynamoDBClient({}),
  pageSize: 25
};
const commandParams = {};
const paginator = paginateListTables(paginatorConfig, commandParams);

const tableNames = [];
for await (const page of paginator) {
  // page contains a single paginated output.
  tableNames.push(...page.TableNames);
}
...

Or simplified:

...
const client = new DynamoDBClient({});

const tableNames = [];
for await (const page of paginateListTables({ client }, {})) {
    // page contains a single paginated output.
    tableNames.push(...page.TableNames);
}
...

For a full pagination deep dive, please check out our blog post.

Abort Controller

In v3, we support the AbortController interface which allows you to abort requests as and when desired.

The AbortController Interface provides an abort() method that toggles the state of a corresponding AbortSignal object. Most APIs accept an AbortSignal object, and respond to abort() by rejecting any unsettled promise with an “AbortError”.

// Returns a new controller whose signal is set to a newly created AbortSignal object.
const controller = new AbortController();

// Returns the AbortSignal object associated with controller.
const signal = controller.signal;

// Invoking this method will set controller’s AbortSignal's aborted flag
// and signal to any observers that the associated activity is to be aborted.
controller.abort();

AbortController Usage

In JavaScript SDK v3, we added an implementation of the WHATWG AbortController interface in @aws-sdk/abort-controller. To use it, you need to send AbortController.signal as abortSignal in the httpOptions parameter when calling the .send() operation on the client, as follows:

const { AbortController } = require("@aws-sdk/abort-controller");
const { S3Client, CreateBucketCommand } = require("@aws-sdk/client-s3");

...

const abortController = new AbortController();
const client = new S3Client(clientParams);

const requestPromise = client.send(new CreateBucketCommand(commandParams), {
  abortSignal: abortController.signal,
});

// The abortController can be aborted any time.
// The request will not be created if abortSignal is already aborted.
// The request will be destroyed if abortSignal is aborted before response is returned.
abortController.abort();

// This will fail with "AbortError" as abortSignal is aborted.
await requestPromise;


AbortController Example

The following code snippet shows how to upload a file using S3's putObject API in the browser, with support for aborting the upload. First, create a controller using the AbortController() constructor, then grab a reference to its associated AbortSignal object using the AbortController.signal property. When the PutObjectCommand is called with the .send() operation, pass in AbortController.signal as abortSignal in the httpOptions parameter. This allows you to abort the PutObject operation by calling abortController.abort().

const abortController = new AbortController();
const abortSignal = abortController.signal;

// Define the upload handler before wiring it to the buttons below.
const uploadObject = async (file) => {
  ...
  const client = new S3Client(clientParams);
  try {
    await client.send(new PutObjectCommand(commandParams), { abortSignal });
  } catch (e) {
    if (e.name === "AbortError") {
      uploadProgress.textContent = 'Upload aborted: ' + e.message;
    }
    ...
  }
};

const uploadBtn = document.querySelector('.upload');
const abortBtn = document.querySelector('.abort');

uploadBtn.addEventListener('click', uploadObject);

abortBtn.addEventListener('click', function() {
  abortController.abort();
  console.log('Upload aborted');
});

For a full abort controller deep dive, please check out our blog post.

Middleware Stack

The AWS SDK for JavaScript (v3) maintains a series of asynchronous actions for each operation. These include actions that serialize input parameters into the data sent over the wire and deserialize response data into JavaScript objects. Such actions are implemented as functions called middleware and executed in a specific order. The object that hosts all the middleware, including the ordering information, is called a middleware stack. You can add your custom actions to the SDK and/or remove the default ones.

When an API call is made, the SDK sorts the middleware according to the step it belongs to and its priority within that step. The input parameters pass through each middleware, and an HTTP request gets created and updated along the way. The HTTP handler sends the request to the service and receives a response. The response object is passed back through the same middleware stack in reverse and is deserialized into a JavaScript object.

A middleware is a higher-order function that transforms the user input and/or HTTP request and then delegates to the "next" middleware. It also transfers back the result from the "next" middleware. A middleware function also has access to a context parameter, which optionally contains data to be shared across middleware.

For example, you can use middleware to log or modify a request:

const { S3 } = require("@aws-sdk/client-s3");
const client = new S3({ region: "us-west-2" });

// Middleware added to client, applies to all commands.
client.middlewareStack.add(
  (next, context) => async (args) => {
    args.request.headers["x-amz-meta-foo"] = "bar";
    console.log("AWS SDK context", context.clientName, context.commandName);
    console.log("AWS SDK request input", args.input);
    const result = await next(args);
    console.log("AWS SDK request output:", result.output);
    return result;
  },
  {
    step: "build",
    name: "addFooMetadataMiddleware",
    tags: ["METADATA", "FOO"],
    override: true,
  }
);

await client.putObject(params);

Specifying the absolute location of your middleware

The example above adds middleware to the build step of the middleware stack. The middleware stack contains five steps to manage a request's lifecycle:

  • The initialize lifecycle step initializes an API call. This step typically adds default input values to a command. The HTTP request has not yet been constructed.
  • The serialize lifecycle step constructs an HTTP request for the API call. Examples of typical serialization tasks include input validation and building an HTTP request from user input. The downstream middleware will have access to the serialized HTTP request object in the callback's parameter args.request.
  • The build lifecycle step builds on top of the serialized HTTP request. Examples of typical build tasks include injecting HTTP headers that describe a stable aspect of the request, such as Content-Length or a body checksum. Any request alterations will be applied to all retries.
  • The finalizeRequest lifecycle step prepares the request to be sent over the wire. The request in this stage is semantically complete and should therefore only be altered to match the recipient’s expectations. Examples of typical finalization tasks include request signing, performing retries and injecting hop-by-hop headers.
  • The deserialize lifecycle step deserializes the raw response object into a structured response. The upstream middleware have access to the deserialized data in the next callback's return value: result.output.

Each middleware must be added to a specific step. By default, middleware in the same step have undifferentiated order. In some cases, you might want to execute a middleware before or after another middleware in the same step. You can achieve this by specifying its priority:
client.middlewareStack.add(middleware, {
  name: "MyMiddleware",
  step: "initialize",
  priority: "high", // or "low".
  override: true, // provide both a name and override=true to avoid accidental middleware duplication.
});
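Middleware can also be detached. A minimal sketch, reusing the "MyMiddleware" name and the "METADATA" tag from the examples above:

// Remove by the name given at add() time...
client.middlewareStack.remove("MyMiddleware");

// ...or remove every middleware carrying a given tag.
client.middlewareStack.removeByTag("METADATA");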

For a full middleware stack deep dive, please check out our blog post.

Release Cadence

Our releases usually happen once per weekday. Each release increments the minor version, e.g. 3.200.0 -> 3.201.0.

Node.js versions

v3.201.0 and higher requires Node.js >= 14.

v3.46.0 to v3.200.0 requires Node.js >= 12.

Earlier versions require Node.js >= 10.

Stability of Modular Packages

| Package name               | Containing folder | API controlled by | Stability     |
| -------------------------- | ----------------- | ----------------- | ------------- |
| @aws-sdk/client-* Commands | clients           | AWS service teams | public/stable |
| @aws-sdk/client-* Clients  | clients           | AWS SDK JS team   | public/stable |
| @aws-sdk/lib-*             | lib               | AWS SDK JS team   | public/stable |
| @aws-sdk/*-signer          | packages          | AWS SDK JS team   | public/stable |
| @aws-sdk/middleware-stack  | packages          | AWS SDK JS team   | public/stable |
| remaining @aws-sdk/*       | packages          | AWS SDK JS team   | internal      |

Public interfaces are marked with the @public annotation in source code and appear in our API Reference.

Additional notes:

  • @internal does not mean a package or interface is constantly changing or being actively worked on. It means it is subject to change without any notice period. The changes are included in the release notes.
  • Public interfaces, such as client configuration, are also subject to change in exceptional cases. We will try to provide a deprecation period with advance notice.

All supported interfaces are provided at the package level, e.g.:

import { S3Client } from "@aws-sdk/client-s3"; // Yes, do this.

import { S3Client } from "@aws-sdk/client-s3/dist-cjs/S3Client"; // No, don't do this.

Do not import from a deep path in any package, since the file structure may change, and in the future packages may include exports metadata in package.json, preventing access to the file structure.

Known Issues

Functionality requiring AWS Common Runtime (CRT)

This SDK has optional functionality that requires the AWS Common Runtime (CRT) bindings to be included as a dependency with your application. This functionality includes:

  • S3 Multi-Region Access Point (MRAP), via @aws-sdk/signature-v4-crt (see the related issue below)

If the required AWS Common Runtime components are not installed, you will receive an error like:

Cannot find module '@aws-sdk/signature-v4-crt'
...
Please check whether you have installed the "@aws-sdk/signature-v4-crt" package explicitly.
You must also register the package by calling [require("@aws-sdk/signature-v4-crt");]
or an ESM equivalent such as [import "@aws-sdk/signature-v4-crt";].
For more information please go to
https://github.com/aws/aws-sdk-js-v3#functionality-requiring-aws-common-runtime-crt

indicating that the required dependency is missing to use the associated functionality. To install this dependency, follow the provided instructions.

Installing the AWS Common Runtime (CRT) Dependency

You can install the CRT dependency with different commands depending on the package management tool you are using. If you are using NPM:

npm install @aws-sdk/signature-v4-crt

If you are using Yarn:

yarn add @aws-sdk/signature-v4-crt

Additionally, load the signature-v4-crt package by importing it.

require("@aws-sdk/signature-v4-crt");
// or ESM
import "@aws-sdk/signature-v4-crt";

Only the import statement is needed. The implementation then registers itself with @aws-sdk/signature-v4-multi-region and becomes available to it. You do not need to use any imported objects directly.

Related issues

  1. S3 Multi-Region Access Point (MRAP) is not available without an additional dependency


aws-sdk-js-v3's Issues

S3: Simpler Multipart Uploads in V3

Is your feature request related to a problem? Please describe.

I might not have been looking in the right place, but I can't find a method matching s3.upload (https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property) in v3. Specifically, something that handles all the stages of multipart uploads (initiate multipart, upload part, complete multipart).

Describe the solution you'd like

A method similar to the v2 s3.upload which does all the heavy lifting for multipart uploads using something like the managed upload module (https://github.com/aws/aws-sdk-js/blob/master/lib/s3/managed_upload.js).

Describe alternatives you've considered

  • PUT Object command (not suitable for multipart, I believe, and the response does not match that of the complete multipart command).
  • Manual multipart (initiate, upload parts, complete), which can be cumbersome and error-prone.

Additional context

N/A

Thanks!

An in-range update of lint-staged is breaking the build 🚨

The devDependency lint-staged was updated from 8.2.0 to 8.2.1.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

lint-staged is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build passed (Details).
  • codecov/project: 90.46% (-0.93%) compared to 3f00b99 (Details).
  • codecov/patch: Coverage not affected when comparing 3f00b99...9318cdf (Details).

Release Notes for v8.2.1

8.2.1 (2019-06-13)

Bug Fixes

  • Override env GIT_DIR variable to resolve to the correct git dir path (#629) (5892455), closes #627
Commits

The new version differs by 2 commits.

  • 5892455 fix: Override env GIT_DIR variable to resolve to the correct git dir path (#629)
  • bffef73 chore: Fix tests on Windows (#604)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

DynamoDB: Putting item that contains multi-byte characters causes error

I tried to put an item that contains multi-byte characters and it produces an error.

Could you please help me?

https://github.com/cockscomb/aws-sdk-js-v3-test/blob/51d3a2416e2ed4859574457e080dd15a682929ae/src/dynamodb-multibyte.test.ts#L51-L71

    InternalFailure: The request processing has failed because of an unknown error, exception or failure.

      at JsonRpcParser.Object.<anonymous>.exports.jsonErrorUnmarshaller [as parseServiceException] (node_modules/@aws-sdk/json-error-unmarshaller/src/index.ts:65:33)
      at JsonRpcParser.<anonymous> (node_modules/@aws-sdk/protocol-json-rpc/src/JsonRpcParser.ts:33:24)
      at step (node_modules/tslib/tslib.js:133:27)
      at Object.next (node_modules/tslib/tslib.js:114:57)
      at fulfilled (node_modules/tslib/tslib.js:104:62)
Test result:

$ docker-compose run --rm node npm test
Creating network "aws-sdk-js-v3-test_default" with the default driver
Creating aws-sdk-js-v3-test_dynamodb_1 ... done

[email protected] test /app
jest

FAIL src/dynamodb-multibyte.test.ts (9.712s)
DynamoDBClient
✓ create table (603ms)
✓ put item (76ms)
✓ get item (39ms)
✕ put item contains multi-byte characters (599ms)
✕ get item contains multi-byte characters (51ms)

● DynamoDBClient › put item contains multi-byte characters

InternalFailure: The request processing has failed because of an unknown error, exception or failure.

  at JsonRpcParser.Object.<anonymous>.exports.jsonErrorUnmarshaller [as parseServiceException] (node_modules/@aws-sdk/json-error-unmarshaller/src/index.ts:65:33)
  at JsonRpcParser.<anonymous> (node_modules/@aws-sdk/protocol-json-rpc/src/JsonRpcParser.ts:33:24)
  at step (node_modules/tslib/tslib.js:133:27)
  at Object.next (node_modules/tslib/tslib.js:114:57)
  at fulfilled (node_modules/tslib/tslib.js:104:62)

● DynamoDBClient › get item contains multi-byte characters

TypeError: Cannot read property 'Title' of undefined

  68 |         });
  69 |         const result = await dynamoDB.send(getItem);
> 70 |         expect(result.Item!.Title.S).toEqual('これはitem2。')
     |                             ^
  71 |     });
  72 | });
  73 |

  at Object.it (src/dynamodb-multibyte.test.ts:70:29)

Test Suites: 1 failed, 1 total
Tests: 2 failed, 3 passed, 5 total
Snapshots: 0 total
Time: 13.131s
Ran all test suites.
npm ERR! Test failed. See above for more details.


S3 client incorrectly signs requests with special object keys

Describe the bug
The S3 client incorrectly signs requests when the object key contains special characters.

Additional context

    const client = new S3Client({region: 'us-west-2'});
    const command = new PutObjectCommand({
        Key: 'key?:colon',
        Bucket: 'bucket',
        Body: 'body'
    });
    try {
        const data = await client.send(command);
        console.log(data);
    } catch(e) {
        console.log(e)
    }

response:

{ SignatureDoesNotMatch: The request signature we calculated does not match the signature you provided. Check your key and signing method.
    at RestParser.exports.s3ErrorUnmarshaller [as parseServiceException] (/Users/zheallan/workspace/aws-sdk-js-v3/packages/s3-error-unmarshaller/build/index.js:46:58)
    at /Users/zheallan/workspace/aws-sdk-js-v3/packages/protocol-rest/build/RestParser.js:27:29
    at process._tickCallback (internal/process/next_tick.js:68:7)
  name: 'SignatureDoesNotMatch',
  message:
   'The request signature we calculated does not match the signature you provided. Check your key and signing method.',
  details: {},
  '$metadata':
   { httpHeaders:
      { 'x-amz-request-id': 'F3E9A251974AF6C4',
        'x-amz-id-2':
         '//lL4YKgVhUwD/29upCQcqyniyrDAJSuTyUrnjjUQKeSmVcB2M3SzCDIyBJxRBPpGSQfJN1/oT4=',
        'content-type': 'application/xml',
        'transfer-encoding': 'chunked',
        date: 'Fri, 28 Jun 2019 18:30:33 GMT',
        connection: 'close',
        server: 'AmazonS3' },
     httpStatusCode: 403,
     requestId: 'F3E9A251974AF6C4',
     extendedRequestId:
      '//lL4YKgVhUwD/29upCQcqyniyrDAJSuTyUrnjjUQKeSmVcB2M3SzCDIyBJxRBPpGSQfJN1/oT4=',
     cfId: undefined,
     retries: 0,
     totalRetryDelay: 0 } }

suggestion: Refactoring library structure

It would be beneficial to separate the generated client packages from the packages folder. We can have all the service client packages in the packages folder and put all the hand-written packages in the lib folder. This way, we have a clear indication of which packages are open to PRs and which packages you should only expect to be generated. Lerna supports multiple root folders, and we should do this before we commit all the clients, so that published packages won't have their versions bumped just because we refactored the library structure.

Release scripts should update CodeGen code (if necessary)

Describe the bug

To Reproduce (observed behavior)
Run release scripts

Expected behavior
If release scripts update a package version that affects CodeGen, the related code change should also be made.

Usability with _AttributeValue type

In the old v2 api, I could do something like this:

    client.get({ TableName, Key: { id } }, (err, data) => {
      if (err) {
        reject(err)
      } else {
        resolve(data.Item as any)
      }
    })

Now I have to do the following:

    const get = new GetItemCommand({ TableName, Key: { id: {S: id} } })
    client.send(get, (err, data) => {
      if (err) {
        reject(err)
      } else {
        resolve(data.Item as any)
      }
    })

Is the S: id really necessary? That would mean I have to convert all my objects.
Am I incorrectly calling a too low level API?
This seems way more complicated than it should be.

I got it from here: https://github.com/aws/aws-sdk-js-v3/tree/master/packages/client-dynamodb-v2-node

Implement SourceFile and SaveAs parameters in S3 API

In the PHP SDK, two parameters SourceFile and SaveAs exist that do not exist in the JavaScript SDK. As the documentation states:

The SDK also provides a shortcut for uploading directly from a file using the SourceFile parameter, instead of the Body parameter.

The SourceFile and SaveAs parameters allow you to use the SDK to directly upload files to and download files from S3 very easily.

It's true that these two parameters make uploading and downloading files much easier. For API consistency across platforms, could we get SourceFile and SaveAs added into the JavaScript API please?
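For reference, the closest equivalent in the JavaScript SDK today is streaming to and from the filesystem. A minimal Node.js sketch (bucket, key, and file paths are illustrative):

const { createReadStream, createWriteStream, statSync } = require("fs");
const { S3 } = require("@aws-sdk/client-s3");

(async () => {
  const s3 = new S3({});

  // Upload directly from a file (what SourceFile does in the PHP SDK).
  await s3.putObject({
    Bucket: "my-bucket",
    Key: "my-key",
    Body: createReadStream("./local-file"),
    ContentLength: statSync("./local-file").size, // streams of unknown length may be rejected
  });

  // Download to a file (what SaveAs does in the PHP SDK).
  const { Body } = await s3.getObject({ Bucket: "my-bucket", Key: "my-key" });
  Body.pipe(createWriteStream("./saved-file"));
})();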

@aws-sdk/client-documentation-generator is not published to npm

Describe the bug
@aws-sdk/client-documentation-generator is not published to npm.

https://github.com/aws/aws-sdk-js-v3/tree/master/packages/client-documentation-generator

@aws-sdk/client-documentation-generator is a dependency of the packages that are created by @aws-sdk/package-generator, which is published.

I'd suggest removing package-generator, or removing client-documentation-generator from the generated dependencies, or publishing client-documentation-generator.

Create an S3 presigned url package

The presign function in the SigV4 package only supports presigning the httpRequest object. It would be helpful if an S3 presigned URL could be generated from a given operation and parameters.
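For reference, this capability later shipped in the modular SDK as @aws-sdk/s3-request-presigner, which presigns a command rather than a raw httpRequest. A minimal sketch:

const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

(async () => {
  const client = new S3Client({ region: "us-west-2" });
  const command = new GetObjectCommand({ Bucket: "my-bucket", Key: "my-key" });

  // Generate a presigned URL that is valid for one hour.
  const url = await getSignedUrl(client, command, { expiresIn: 3600 });
  console.log(url);
})();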

suggestion: generate interfaces for services

Currently, services are exported as classes. Generating interfaces and exporting them would make it easier to switch out implementations. Then the library consumer could depend on the interface, not the implementation.

class S3 implements IS3 {

Examples what someone might want to implement behind an S3 interface:

  • In-Memory implementation for much faster integration tests
  • Local filesystem implementation for sensitive data
  • Exposing files from remote websites as virtual read only objects in bucket
  • Exposing "computed" objects which are generated on request
  • Combining multiple implementations and "routing" based on bucket name

Some of those could be implemented at the HTTP level or via the provided middleware functionality, but some only make sense implemented directly as an implementation of a TypeScript interface.

DynamoDB client does not parse request id from header

The response is supposed to have a requestId value that contains the value from the x-amzn-requestid header. We need another statement that extracts the request ID from that header in addition to x-amz-requestid. A small update here may fix the issue.
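A sketch of the suggested fix; the surrounding deserializer internals are assumed, and only the header fallback is the point:

// Prefer x-amzn-requestid (sent by DynamoDB), falling back to x-amz-requestid.
const requestId =
  response.headers["x-amzn-requestid"] || response.headers["x-amz-requestid"];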

CognitoIdentityClient requires 'credentials' in its constructor

I am trying to construct a CognitoIdentityClient to get AWS Credentials for unauthenticated users.
Here is the code:

import { CognitoIdentityClient } from '@aws-sdk/client-cognito-identity-browser/CognitoIdentityClient'
const cognitoIdentityClient = new CognitoIdentityClient({
  region: 'us-wes'
})

TS throws the compile error Property 'credentials' is missing in type...
credentials should not be required to create a CognitoIdentityClient, since it is the thing used to get credentials.

dynamo scan json unmarshalling error

Seeing this when I try to do a DynamoDB scan:

{ TypeError: Cannot read property 'type' of undefined
    at JsonParser.unmarshall (/Users/chrisyoung/Desktop/prod/node_modules/@aws-sdk/json-parser/build/index.js:13:19)
    at /Users/chrisyoung/Desktop/prod/node_modules/@aws-sdk/json-parser/build/index.js:41:37
    at Array.reduce (<anonymous>)
    at JsonParser.unmarshall (/Users/chrisyoung/Desktop/prod/node_modules/@aws-sdk/json-parser/build/index.js:40:18)
    at /Users/chrisyoung/Desktop/prod/node_modules/@aws-sdk/json-parser/build/index.js:22:48
    at Array.reduce (<anonymous>)
    at JsonParser.unmarshall (/Users/chrisyoung/Desktop/prod/node_modules/@aws-sdk/json-parser/build/index.js:18:18)
    at /Users/chrisyoung/Desktop/prod/node_modules/@aws-sdk/json-parser/build/index.js:41:37
    at Array.reduce (<anonymous>)
    at JsonParser.unmarshall (/Users/chrisyoung/Desktop/prod/node_modules/@aws-sdk/json-parser/build/index.js:40:18) '$metadata': { retries: 0, totalRetryDelay: 0 } }

This code reproduces the error:

const { DynamoDB } = require('@aws-sdk/client-dynamodb-v2-node');

(async function main() {

  const db = new DynamoDB({ region: 'us-east-1' });

  try {
    const params = { TableName: 'foo' };
    const res = await db.scan(params);

    console.log(res);
  } catch (err) {
    console.error(err);
  }

})();

I will try to take a deeper look into what's causing the issue tomorrow.

What happened to DynamoDB documentClient

This is more of a question than a feature request. I am using/previewing the new AWS SDK for JavaScript, especially the DynamoDB component. I am using it successfully except for one thing: I cannot find a trace of the DocumentClient found in v2.

Is this still to come, or has it been replaced by something else?
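For context, the high-level concepts section above mentions @aws-sdk/lib-dynamodb, which became the v3 counterpart of the v2 DocumentClient. A minimal sketch:

const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const { DynamoDBDocumentClient, GetCommand } = require("@aws-sdk/lib-dynamodb");

(async () => {
  // Wraps a low-level client and (un)marshalls attribute values automatically.
  const docClient = DynamoDBDocumentClient.from(new DynamoDBClient({}));

  // Plain JavaScript values; no { S: ... } attribute wrappers needed.
  const { Item } = await docClient.send(
    new GetCommand({ TableName: "myTableName", Key: { id: "123" } })
  );
  console.log(Item);
})();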

Remove *.d.ts in karma config

Per the current karma config, the smoke test manages to find the *.ts files and preprocess them using karma-typescript. In some circumstances it will try to find *.d.ts files that are not yet generated, which may lead to a 'resource cannot be found' error. We need to exclude these files in the karma config.
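A sketch of the change, assuming a standard karma.conf.js layout:

// karma.conf.js (excerpt): keep declaration files out of the preprocessing step.
module.exports = function (config) {
  config.set({
    // ...existing configuration...
    exclude: ["**/*.d.ts"],
  });
};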

DynamoDB: Buffer value not always accepted for binary (B) attributes

A DynamoDB field attribute of type B (binary) is defined in _AttributeValue like this: B?: ArrayBuffer | ArrayBufferView | string;

Using putItem I can easily write a field attribute with a Buffer value, which is presumably later automatically converted to a Base64 representation for transmission over the network, but it looks like this is not always possible.

I tried applying a Buffer value to ExpressionAttributeValues in an UpdateItemInput object, but only Base64 encoded string works here. This doesn't throw an exception directly, but the condition will not be met, i.e. the resulting error is: ConditionalCheckFailedException: The conditional request failed.

Example:

const code = Buffer.from('aabbccddeeff', 'hex');

const args: UpdateItemInput = {
	TableName: 'myTableName',
	Key: {
		email: { S: 'xxx@xxx' },
	},
	ExpressionAttributeNames: {
		'#f': 'TestField',
	},
	ExpressionAttributeValues: {
		// ':c': { B: code }, // This should work, but doesn't
		// ':c': { B: code.buffer }, // This should work, but doesn't
		':c': { B: code.toString('base64') }, // Only Base64 representation works
	},
	ConditionExpression: '#f = :c',
	UpdateExpression: 'REMOVE #f',
};

client.updateItem(args);

Used SDK version: 0.1.0-preview.1

An in-range update of commitlint is breaking the build 🚨

There have been updates to the commitlint monorepo:

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

This monorepo update includes releases of one or more dependencies which all belong to the commitlint group definition.

commitlint is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Commits

The new version differs by 39 commits.

  • c17420d v8.1.0
  • ca19d70 chore: update dependency lodash to v4.17.14 (#724)
  • 5757ef2 build(deps): bump lodash.template from 4.4.0 to 4.5.0 (#721)
  • 5b5f855 build(deps): bump lodash.merge from 4.6.0 to 4.6.2 (#722)
  • 4cb979d build(deps): bump lodash from 4.17.11 to 4.17.13 (#723)
  • a89c1ba chore: add devcontainer setup
  • 9aa5709 chore: pin dependencies (#714)
  • c9ef5e2 chore: centralize typescript and jest setups (#710)
  • c9dcf1a chore: pin dependencies (#708)
  • 6a6a8b0 refactor: rewrite top level to typescript (#679)
  • 0fedbc0 chore: update dependency @types/jest to v24.0.15 (#694)
  • 0b9c7ed chore: update dependency typescript to v3.5.2 (#695)
  • 4efb34b chore: update dependency globby to v10 (#705)
  • 804af8b chore: update dependency lint-staged to v8.2.1 (#696)
  • 9075844 fix: add explicit dependency on chalk (#687)

There are 39 commits in total.

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Signature mismatch for execute api service

Describe the bug
Getting a signature mismatch when using the generated signature as below.

SDK version number
0.1.0-preview.2

Is the issue in the browser/Node.js?
Node.js

Details of the browser/Node.js version
v8.9.4

To Reproduce (observed behavior)

export const signRequest = async (request) => {
    Logger.info('Generating signed request');
    const signature = new SignatureV4({
        service: 'execute-api',
        region: 'us-east-1',
        credentials: {
            accessKeyId: getEnv(ENV.CREDENTIALS.ACCESS_KEY),
            secretAccessKey: getEnv(ENV.CREDENTIALS.SECRET_KEY),
            sessionToken: getEnv(ENV.CREDENTIALS.SESSION_TOKEN)
        },
        applyChecksum: false
    });
    const result = await signature.sign(request);
    Logger.debug(`Signature is ${JSON.stringify(result)}`);
    return result;
};

Request is

             {
                    method: "POST",
                    protocol: "https:",
                    path: url.pathname,
                    hostname: url.host,
                    headers: {
                        "X-Amz-Content-Sha256": "UNSIGNED-PAYLOAD",
                        'x-api-key': NBA.API_KEY,
                        'Content-Type': 'application/json'
                    },
                    body: nbaReq
                }

Expected behavior
should generate signature headers from the given request.

Error Message
The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details.\n\nThe Canonical String for this request should have been\n .....

Cannot find module '@aws-sdk/client-lambda-node/InvokeAsyncCommand'

What's the right command for invoking a Lambda function - I was trying something like this:

const { LambdaClient } = require('@aws-sdk/client-lambda-node/LambdaClient');
const { InvokeAsyncCommand } = require('@aws-sdk/client-lambda-node/InvokeAsyncCommand');

var lambda = new LambdaClient({region: 'eu-central-1'});
var params = {
  FunctionName: 'function:myFunctionName',
  DryRun: false,
  Publish: true,
  RevisionId: 'STRING_VALUE',
  ZipFile: data
};

const invokeAsyncCommand = new InvokeAsyncCommand(params);
lambda.send(invokeAsyncCommand).then(data => {
  log(chalk.green(data));
}).catch(error => {
  log(chalk.red(error));
});

But I am getting Cannot find module '@aws-sdk/client-lambda-node/InvokeAsyncCommand'

Module for SQS

I don't see any modules for working with SQS. Is it being planned?
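For reference, SQS follows the same per-service client pattern and shipped as @aws-sdk/client-sqs. A minimal sketch (the queue URL is illustrative):

const { SQSClient, SendMessageCommand } = require("@aws-sdk/client-sqs");

(async () => {
  const sqs = new SQSClient({ region: "us-east-1" });
  await sqs.send(
    new SendMessageCommand({
      QueueUrl: "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue",
      MessageBody: "hello",
    })
  );
})();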

fix: node-http-handler tests which require NODE_TLS_REJECT_UNAUTHORIZED=0

While updating Jest v20 -> v24 in #243, some node-http-handler tests were commented out to unblock the Jest upgrade:

/*it("can send https requests", async () => {
const mockResponse = {
statusCode: 200,
headers: {},
body: "test"
};
mockHttpsServer.addListener(
"request",
createResponseFunction(mockResponse)
);
const nodeHttpHandler = new NodeHttpHandler();
let response = await nodeHttpHandler.handle(
{
hostname: "localhost",
method: "GET",
port: mockHttpsServer.address().port,
protocol: "https:",
path: "/",
headers: {}
},
{}
);
expect(response.statusCode).toEqual(mockResponse.statusCode);
expect(response.headers).toBeDefined();
expect(response.headers).toMatchObject(mockResponse.headers);
expect(response.body).toBeDefined();
});
it("can send requests with bodies", async () => {
const body = Buffer.from("test");
const mockResponse = {
statusCode: 200,
headers: {}
};
mockHttpsServer.addListener(
"request",
createResponseFunction(mockResponse)
);
const spy = jest.spyOn(https, "request").mockImplementationOnce(() => {
let calls = spy.mock.calls;
let currentIndex = calls.length - 1;
return https.request(calls[currentIndex][0], calls[currentIndex][1]);
});
const nodeHttpHandler = new NodeHttpHandler();
let response = await nodeHttpHandler.handle(
{
hostname: "localhost",
method: "PUT",
port: mockHttpsServer.address().port,
protocol: "https:",
path: "/",
headers: {},
body
},
{}
);
expect(response.statusCode).toEqual(mockResponse.statusCode);
expect(response.headers).toBeDefined();
expect(response.headers).toMatchObject(mockResponse.headers);
});
it("can handle expect 100-continue", async () => {
const body = Buffer.from("test");
const mockResponse = {
statusCode: 200,
headers: {}
};
mockHttpsServer.addListener(
"checkContinue",
createContinueResponseFunction(mockResponse)
);
let endSpy: jest.SpyInstance<any>;
let continueWasTriggered = false;
const spy = jest.spyOn(https, "request").mockImplementationOnce(() => {
let calls = spy.mock.calls;
let currentIndex = calls.length - 1;
const request = https.request(
calls[currentIndex][0],
calls[currentIndex][1]
);
request.on("continue", () => {
continueWasTriggered = true;
});
endSpy = jest.spyOn(request, "end");
return request;
});
const nodeHttpHandler = new NodeHttpHandler();
let response = await nodeHttpHandler.handle(
{
hostname: "localhost",
method: "PUT",
port: mockHttpsServer.address().port,
protocol: "https:",
path: "/",
headers: {
Expect: "100-continue"
},
body
},
{}
);
expect(response.statusCode).toEqual(mockResponse.statusCode);
expect(response.headers).toBeDefined();
expect(response.headers).toMatchObject(mockResponse.headers);
expect(endSpy!.mock.calls.length).toBe(1);
expect(endSpy!.mock.calls[0][0]).toBe(body);
expect(continueWasTriggered).toBe(true);
});
it("can send requests with streaming bodies", async () => {
const body = new ReadFromBuffers({
buffers: [
Buffer.from("t"),
Buffer.from("e"),
Buffer.from("s"),
Buffer.from("t")
]
});
let inputBodySpy = jest.spyOn(body, "pipe");
const mockResponse = {
statusCode: 200,
headers: {}
};
mockHttpsServer.addListener(
"request",
createResponseFunction(mockResponse)
);
const nodeHttpHandler = new NodeHttpHandler();
let response = await nodeHttpHandler.handle(
{
hostname: "localhost",
method: "PUT",
port: mockHttpsServer.address().port,
protocol: "https:",
path: "/",
headers: {},
body
},
{}
);
expect(response.statusCode).toEqual(mockResponse.statusCode);
expect(response.headers).toBeDefined();
expect(response.headers).toMatchObject(mockResponse.headers);
expect(inputBodySpy.mock.calls.length).toBeTruthy();
});*/

These tests need to be fixed.

Update TypeScript dependency of packages to ^2.7

Some packages will fail to compile if they are using 2.6, even though that is their required minimum version. I think we don't see this issue more often because usually 2.7 is installed and hoisted.

This at least affects modeled-endpoint-middleware, but we should move all packages at once if possible.

Cognito login functionality

Hey there. I'm skimming over sdk-js-v3 and could not find the Login/Register/... functionality for Cognito user pools.

Currently, I can do those with the @aws-amplify/auth package. Did I miss something, or are functions missing?

Use dependabot instead of greenkeeper for automated dependency management

Is your feature request related to a problem? Please describe.
We're currently using greenkeeper for automated dependency management. It's good, but dependabot seems much better.

Describe the solution you'd like
Use dependabot for automated dependency management.

Describe alternatives you've considered
Greenkeeper/Renovate

Opt out of minor and/or patch updates in automated dependency management

Is your feature request related to a problem? Please describe.

  • We've been using Greenkeeper for automated dependency management since 2019/05/16 (original PR #248)
  • As of 2019/07/27, it has created 12 PRs as follows:
    • 2 initial PRs
    • 5 major version updates
    • 1 minor version update
    • 4 patch updates
    • we've disabled updates for some dependencies (link)
  • Find a way to opt out of minor version or patch updates (except for security issues). Reason: they're not worth the time spent reviewing PRs and testing.
    • We use carets (doc) for most of our dependencies, so latest “Compatible with version” will be installed
    • We plan to commit yarn lockfile once we move to yarn workspaces

Describe the solution you'd like
Opt out of minor and/or patch updates.

Describe alternatives you've considered
Reviewing/ignoring patch version updates PRs from greenkeeper

EDITs:

  • (trivikr) Updated issue link from greenkeeper repo

TypeError in JsonBuilder using DynamoDB map

Hi, I already had success with the DynamoDB putItem function, but when I want my Item to contain a map (M), I get an error: TypeError: Cannot read property 'type' of undefined

Code example:

const args: PutItemInput = {
	TableName: 'myTableName',
	Item: {
		email: { S: 'xxx@xxx' },
		example: {
			M: {
				email: { S: 'hello' },
			}
		},
	},
	ConditionExpression: 'attribute_not_exists(email)',
};
return client.putItem(args);

I also tried representing the map as an array:

const args: PutItemInput = {
	TableName: 'myTableName',
	Item: {
		email: { S: 'xxx@xxx' },
		example: {
			M: [ 'email', { S: 'hello' } ],
		},
	},
	ConditionExpression: 'attribute_not_exists(email)',
};

Full error:

    TypeError: Cannot read property 'type' of undefined

      at JsonBuilder.Object.<anonymous>.JsonBuilder.format (node_modules/@aws-sdk/json-builder/src/index.ts:41:19)
      at JsonBuilder.Object.<anonymous>.JsonBuilder.format (node_modules/@aws-sdk/json-builder/src/index.ts:103:34)
      at JsonBuilder.Object.<anonymous>.JsonBuilder.format (node_modules/@aws-sdk/json-builder/src/index.ts:65:47)
      at JsonBuilder.Object.<anonymous>.JsonBuilder.format (node_modules/@aws-sdk/json-builder/src/index.ts:103:34)
      at JsonBuilder.Object.<anonymous>.JsonBuilder.format (node_modules/@aws-sdk/json-builder/src/index.ts:65:47)
      at JsonBuilder.Object.<anonymous>.JsonBuilder.build (node_modules/@aws-sdk/json-builder/src/index.ts:36:36)
      at JsonRpcSerializer.Object.<anonymous>.JsonRpcSerializer.serialize (node_modules/@aws-sdk/protocol-json-rpc/src/JsonRpcSerializer.ts:33:39)
      at Object.<anonymous> (node_modules/@aws-sdk/middleware-serializer/src/index.ts:24:43)
      at step (node_modules/tslib/tslib.js:133:27)
      at Object.next (node_modules/tslib/tslib.js:114:57)
      at fulfilled (node_modules/tslib/tslib.js:104:62)

Used SDK version: 0.1.0-preview.1
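
For what it's worth, the first example above matches the documented AttributeValue shape (a map M is a plain object keyed by attribute name, not an array), so this looks like a serializer bug in the preview. In GA releases of v3, the marshall() helper from @aws-sdk/util-dynamodb builds such nested maps from plain JavaScript objects:

import { marshall } from "@aws-sdk/util-dynamodb";

// Produces { email: { S: "xxx@xxx" }, example: { M: { email: { S: "hello" } } } }
const Item = marshall({
  email: "xxx@xxx",
  example: { email: "hello" },
});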

package-generator generate client will fail if /tmp is on a different device than the target directory.

Describe the bug
package-generator generate client will fail if /tmp is on a different device than the target directory.

You cannot rename a file across partitions or devices.
https://github.com/aws/aws-sdk-js-v3/blob/master/packages/package-generator/src/importModule.ts#L44

Error: EXDEV: cross-device link not permitted, rename '/tmp/pk9bBD' -> '/local/target'

SDK version number
"@aws-sdk/package-generator": "0.1.0-preview.2"

Is the issue in the browser/Node.js?
Build tool, Linux, but I think Windows too.

To Reproduce (observed behavior)
Use a machine with more than one device/partition.
Run the package-generator generate client command while the target directory is on a different device than /tmp.
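
A minimal sketch of the usual EXDEV workaround (an assumed fix, not necessarily the patch the repository ended up with): fall back to copy-then-delete when rename() crosses devices. fs.cpSync requires Node.js >= 16.7; older runtimes need a manual recursive copy.

import { cpSync, renameSync, rmSync } from "fs";

function moveAcrossDevices(src: string, dest: string): void {
  try {
    renameSync(src, dest); // fast path: same device
  } catch (err) {
    if ((err as NodeJS.ErrnoException).code !== "EXDEV") throw err;
    cpSync(src, dest, { recursive: true }); // copying works across devices
    rmSync(src, { recursive: true, force: true });
  }
}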

"Advanced configuration has been deprecated" Validation error on precommit

Describe the bug
This bug is related to the precommit setup, not the SDK. The following error is thrown when git commit is run in the repository:

Could not parse lint-staged config.

        Error: ● Validation Error:

Invalid value for 'linters'.

Advanced configuration has been deprecated. For more info, please visit: https://github.com/okonet/lint-staged.

Configured value is: {'packages/**/*.{ts,md,json}': ['prettier --write', 'git add']}

To Reproduce (observed behavior)
Make some changes to any file, then run git add and git commit.
The validation error is thrown, prettier doesn't run, and no commit is created.

Expected behavior
Prettier runs on precommit and commit gets created as described in #225

Screenshots
prettier-precommit-error

Additional context
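A likely fix, per the lint-staged deprecation notice linked in the error: drop the deprecated linters wrapper so the globs sit at the top level of the config (the exact file holding the lint-staged config is assumed to be package.json):

{
  "lint-staged": {
    "packages/**/*.{ts,md,json}": ["prettier --write", "git add"]
  }
}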

How to build packages for contributing?

Hey folks,

I've seen #153 while wondering about missing packages and found the answer explaining how to generate a package based on the provided models.

So what I did was

$ npm install
$ npm run bootstrap
$ node ./packages/package-generator/build/cli.js client --m ../../models/s3/2006-03-01/service-2.json --r node --s ../../models/s3/2006-03-01/smoke.json

This command fails with

Cannot find module '/Users/stephan/dev/github/aws-sdk-js-v3/packages/package-generator/build/cli.js'

as the build folder does not exist.

The package does not seem to have a dedicated build script and instead builds in the prepublishOnly hook.
I checked, and the prepublish lifecycle hook is called during lerna bootstrap:

npm run bootstrap

> [email protected] bootstrap /Users/stephan/dev/github/aws-sdk-js-v3
> lerna bootstrap

lerna info version 2.11.0
lerna info versioning independent
lerna info Bootstrapping 117 packages
lerna info lifecycle preinstall
lerna WARN EHOIST_PKG_VERSION "@aws-sdk/credential-provider-cognito-identity" package depends on jest@^21, which differs from the hoisted jest@^20.0.4.
lerna info Installing external dependencies
lerna info hoist Installing hoisted dependencies into root
lerna info hoist Pruning hoisted dependencies
lerna info hoist Finished pruning hoisted dependencies
lerna info hoist Finished installing in root
lerna info Symlinking packages and binaries
lerna info lifecycle postinstall
lerna info lifecycle prepublish
lerna info lifecycle prepare
lerna success Bootstrapped 117 packages

Then I thought I'd just run tsc manually for this package:

$ lerna exec --scope @aws-sdk/package-generator -- tsc

This fails because the TypeScript compiler cannot find @aws-sdk/build-types and others, so the symlinking step might not have been successful either.

How do you all usually work with this repository? Am I missing something? (My monorepo experience is quite limited)

Looking forward to playing around with the shiny new SDK!
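
Not an official answer, but two things that typically work in a lerna monorepo like this one (the script and package names are taken from the output above; whether they apply exactly here is an assumption):

$ # run the package's publish-time build hook explicitly:
$ npx lerna run prepublishOnly --scope @aws-sdk/package-generator
$ # or re-link workspace packages, then compile:
$ npx lerna bootstrap
$ npx lerna exec --scope @aws-sdk/package-generator -- tsc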

Suggestion: Directly exporting service actions

At the moment the packages export a structure similar to v2: a service class with methods for each action. In theory, you could get better tree-shaking, and possibly easier test mocking, by directly exporting a function for each action that takes a configuration or context as a parameter. For example, instead of:

import { Foo } from "@aws-sdk/client-foo-node"

let foo: Foo;
export function init(config) {
    foo = new Foo(config);
}

export async function action(params) {
    await foo.bar(params);
}

doing something like:

import * as Foo from "@aws-sdk/client-foo-node"

let client: Foo.FooClient;
export function init(config) {
    client = new Foo.FooClient(config);
}
export async function action(params) {
    await Foo.bar(client, params);
}

I'm not sure exactly how much impact this could have, but it looks like it could let an optimiser remove all of the unused command/* and model/* classes. Since the usage I've seen at our company is very commonly a small handful of actions spread over multiple services, this could be a pretty good win.

Obviously, this means testing can't just pass a mock client object to the tested methods. But it's pretty standard to init the client in the module of the unit under test (like above), which means your test mocking would already need to handle both the client creation and the method, so this would be about the same.

Using a fake mocking library for illustration, this would require a particularly nosy test to change from:

// ...
class MockFoo {
  constructor(public config) {}
  bar = mock.fn(1);
}

test("action1 calls bar on arg client", async () => {
  const { action1 } = await import("./testee");
  const client = new MockFoo();
  await action1(client, mock.object(1));
  assert(mock.fn.single.calls.single.args(client, mock.object(1)));
});

test("action2 calls bar on module client", async () => {
  mock.module("@aws-sdk/client-foo-node", {
    Foo: MockFoo,
  });
  const { init, action2 } = await import("./testee");
  init(mock.object(1));
  await action2(mock.object(2));
  assert(mock.fn.single.calls.single.args(
    mock.instance(MockFoo, { config: mock.object(1) }),
    mock.object(2)));
});

to:

// ...
class MockFoo {
  constructor(public config) {}
}
test("action1 calls bar on arg client", async () => {
  mock.module("@aws-sdk/client-foo-node", {
    bar: mock.fn(1),
  });
  const { action1 } = await import("./testee");
  const client = new MockFoo();
  await action1(client, mock.object(1));
  assert(mock.fn(1).calls.single.args(undefined, client, mock.object(1)));
});

test("action calls bar on module client", async () => {
  mock.module("@aws-sdk/client-foo-node", {
    Foo: MockFoo,
    bar: mock.fn(1),
  });
  const { init, action2 } = await import("./testee");
  init(mock.object(1));
  await action2(mock.object(2));
  assert(mock.fn(1).calls.single.args(
    undefined,
    mock.instance(MockFoo, { config: mock.object(1) }),
    mock.object(2)));
});

That is, it allows decoupling mocking the client from mocking the action, at the cost of requiring module mocking for users who were passing the client as an argument before. In my experience this would be a good trade-off, but it would be good to get more feedback.

A compromise would be to document the effect of each action function on the client (currently, send()?), so tests could mock only the client and assert the expected calls, but that seems like it could be restrictive.

This might also allow creating and sharing middleware directly between multiple services where they are compatible (e.g. multiple JSON-based services in the same region, enforced by the type system), but I'm not sure whether that's reasonable.

Update crypto-supports-webCrypto package name

NPM doesn't allow publishing packages that have capital letters in the package name. If we need to publish this package, the name will need to be updated, as will every reference to this package.

middleware-stack test file compilation fails in TypeScript 3.2

Because TypeScript 3.2 introduces type checking for bind, call, and apply, typing issues that previously went unnoticed are now revealed.

Issue:

@aws-sdk/[email protected] pretest /Users/zheallan/workspace/npmProj/v3/aws-sdk-js-v3/packages/middleware-stack
tsc -p tsconfig.test.json

src/index.spec.ts(51,23): error TS2345: Argument of type '((next: Handler<string[], object>) => Handler<string[], object>) | { priority: number; } | { step: string; }' is not assignable to parameter of type 'FinalizeMiddleware<string[], object, Uint8Array>'.
Type '(next: Handler<string[], object>) => Handler<string[], object>' is not assignable to type 'FinalizeMiddleware<string[], object, Uint8Array>'.
Types of parameters 'next' and 'next' are incompatible.
Types of parameters 'args' and 'args' are incompatible.
Property 'request' is missing in type 'HandlerArguments<string[]>' but required in type 'FinalizeHandlerArguments<string[], Uint8Array>'.
src/index.spec.ts(96,25): error TS2345: Argument of type '(next: Handler<string[], object>) => Handler<string[], object>' is not assignable to parameter of type 'FinalizeMiddleware<string[], object, Uint8Array>'.
src/index.spec.ts(98,13): error TS2345: Argument of type '(next: Handler<string[], object>) => Handler<string[], object>' is not assignable to parameter of type 'FinalizeMiddleware<string[], object, Uint8Array>'.
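
A hedged sketch of the shape of the fix TypeScript 3.2 forces here, with type names and parameters taken from the error output above (so treat the exact signatures as assumptions, not the verified @aws-sdk/types API): type the test middleware for the finalize step, whose handler arguments carry a request, rather than the generic Handler shape.

import { FinalizeHandler, FinalizeHandlerArguments, FinalizeMiddleware } from "@aws-sdk/types";

// Illustrative only: the middleware's next/args now use the finalize-step
// types, so the stricter bind/call/apply checks in TS 3.2 are satisfied.
const finalizeMiddleware: FinalizeMiddleware<string[], object, Uint8Array> =
  (next: FinalizeHandler<string[], object, Uint8Array>) =>
  (args: FinalizeHandlerArguments<string[], Uint8Array>) =>
    next(args);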
