s3's People

Contributors

aakifn, bcachet, laoneo, nikosdion, ryandemmer, tampe125, wilsonge


s3's Issues

Question: Syntax for Creating Folders

I apologize, this isn't an issue per se. I'm hoping you could point me in the right direction: how do I use this awesome script to create a folder within a bucket, or under a bucket/path (sub-folder)?
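A minimal sketch, not an official answer: S3 has no real folders. The console simulates one by storing a zero-byte object whose key ends in "/". Assuming your version's Input class exposes createFromData() and accepts an empty body (the "Cannot upload empty files" issue further down suggests some versions may not), something like this should do it:

// Assumptions: createFromData() exists and tolerates a zero-length body;
// $connector and $bucket are already set up as in the README examples.
// An S3 "folder" is just a zero-byte object whose key ends in "/".
$input = Input::createFromData('');
$connector->putObject($input, $bucket, 'my-folder/my-sub-folder/');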

Check if an object exists on S3

Hi, is there a way to check whether an object (file) exists in my bucket? I see Connector::getObject, but that seems to download the file, while I just need something like objectExists(string $path): bool.
Thanks!
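Not an authoritative answer, but a sketch built only from APIs that appear elsewhere in these issues: getObject() throws a CannotGetFile exception when the key is missing, so it can be wrapped. Note that this downloads the object's body, which is wasteful for large files; a HEAD-based call would be preferable if the version in use offers one (unverified assumption).

use Akeeba\S3\Connector;
use Akeeba\S3\Exception\CannotGetFile;

// Sketch only: relies on getObject() throwing CannotGetFile for missing keys.
function objectExists(Connector $connector, string $bucket, string $path): bool
{
	try
	{
		$connector->getObject($bucket, $path);

		return true;
	}
	catch (CannotGetFile $e)
	{
		return false;
	}
}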

Receiving SignatureDoesNotMatch errors after upgrading from 2.0.0

We use this S3 connector over at Friendica and we recently upgraded the library from version 2.0.0 to version 2.3.1 to suppress Deprecation messages, but unfortunately our users have been reporting errors since.

The code using the connector didn't change, only the library was upgraded. Do you know if any changes in the library would explain this unexpected behavior?

Uncaught Exception Friendica\Core\Storage\Exception\StorageException: "Cannot put data for reference 6e20c847806569877e01306cd884b0be47292dbf059be105cbce272e65c95f05" at /s3_storage/src/S3Client.php line 88

Akeeba\S3\Exception\CannotPutFile: Akeeba\S3\Connector::putObject(): [500] SignatureDoesNotMatch:The request signature we calculated does not match the signature you provided. Check your key and signing method.

Debug info:
SimpleXMLElement Object
(
    [Code] => SignatureDoesNotMatch
    [Message] => The request signature we calculated does not match the signature you provided. Check your key and signing method.
    [Key] => 6e/20
    [BucketName] => 
    [Resource] => /6e/20/c847806569877e01306cd884b0be47292dbf059be105cbce272e65c95f05
    [RequestId] => 17A3F499D8B38E7E
    [HostId] => d766ec9d-ed08-47e8-bb7c-f968dbd4f006
)
 in /addon/s3_storage/vendor/akeeba/s3/src/Connector.php:149
Stack trace:
#0 /addon/s3_storage/src/S3Client.php(85): Akeeba\S3\Connector->putObject()
#1 /src/Model/Photo.php(454): Friendica\Addon\s3_storage\src\S3Client->put()
...
#18 /index.php(46): Friendica\App->runFrontend()
#19 {main}

Next Friendica\Core\Storage\Exception\StorageException: Cannot put data for reference 6e20c847806569877e01306cd884b0be47292dbf059be105cbce272e65c95f05 in /addon/s3_storage/src/S3Client.php:88
Stack trace:
#0 /src/Model/Photo.php(454): Friendica\Addon\s3_storage\src\S3Client->put()
...
#17 /index.php(46): Friendica\App->runFrontend()
#18 {main}

getAuthenticatedURL with HTTPS?

Is there a way for the getAuthenticatedURL to provide an HTTPS link to the file on S3 so the data is encrypted with SSL?

Like:
https://s3-us-east-1.amazonaws.com/documents.mybucket/file.pdf?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=blahblahblah

Instead of:
http://s3-us-east-1.amazonaws.com/documents.mybucket/file.pdf?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=blahblahblah

(The HTTPS link works, but the function returns a non-HTTPS version.)

(I suppose I could do a string replace and change it, but was curious if there's a built-in parameter for SSL.)
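A hedged pointer: $configuration->setSSL(false) appears in another issue here, so the Configuration object's SSL flag presumably also selects the scheme of generated URLs; depending on the version, getAuthenticatedURL() may additionally accept an $https flag as a trailing parameter (worth checking the method signature). For example:

// Assumption: the Configuration SSL flag (seen as setSSL(false) elsewhere in
// these issues) also makes pre-authenticated URLs use https://.
$configuration->setSSL(true);
$url = $connector->getAuthenticatedURL('documents.mybucket', 'file.pdf', 3600);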

Improvement: Response headers handled in a case-sensitive way

Hello,

In the Request module, you parse headers (__responseHeaderCallback) in a case-sensitive way.
You do not force cURL to use HTTP/1.1, and headers in HTTP/2.0 are lowercase,
so you can end up not parsing the headers correctly. A sketch of a possible fix follows the code below.

protected function __responseHeaderCallback(&$curl, &$data)
{
	if (($strlen = strlen($data)) <= 2)
	{
		return $strlen;
	}

	if (substr($data, 0, 4) == 'HTTP')
	{
		$this->response->code = (int) substr($data, 9, 3);

		return $strlen;
	}

	list($header, $value) = explode(': ', trim($data), 2);

	switch (strtolower($header))
	{
		case 'last-modified':
			$this->response->setHeader('time', strtotime($value));
			break;

		case 'content-length':
			$this->response->setHeader('size', (int) $value);
			break;

		case 'content-type':
			$this->response->setHeader('type', $value);
			break;

		case 'etag':
			// Note: the original used the curly brace offset $value{0}, which was
			// removed in PHP 8; $value[0] is the equivalent.
			$this->response->setHeader('hash', $value[0] == '"' ? substr($value, 1, -1) : $value);
			break;

		default:
			// Note: $header is NOT lowercased here, so this match is
			// case-sensitive — exactly the problem with lowercase HTTP/2 names.
			if (preg_match('/^x-amz-meta-.*$/', $header))
			{
				$this->setHeader($header, is_numeric($value) ? (int) $value : $value);
			}
			break;
	}

	return $strlen;
}
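One possible fix, as a sketch rather than the library's actual patch: pin cURL to HTTP/1.1 when the handle is configured, or normalise the header name once so every comparison is case-insensitive.

// Option 1: force HTTP/1.1 so header names keep their traditional casing.
curl_setopt($curl, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_1);

// Option 2: lowercase the parsed name once; HTTP/2 sends lowercase anyway.
list($header, $value) = explode(': ', trim($data), 2);
$header = strtolower($header);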

Hope it helps

Error::setCode() throws TypeError

Sometimes, in an unpredictable way, I get the following error:

Fatal Error: Uncaught TypeError: Akeeba\Engine\PostProc\Connector\S3v4\Response\Error::setCode(): Argument #1 ($code) must be of type int, string given
#0 vendor/akeeba/s3/src/Response/Error.php (52): Akeeba\Engine\PostProc\Connector\S3v4\Response\Error->setCode()
#1 vendor/akeeba/s3/src/Response/Response.php (336): Akeeba\Engine\PostProc\Connector\S3v4\Response\Error->__construct()
#2 vendor/akeeba/s3/src/Response/Response.php (177): Akeeba\Engine\PostProc\Connector\S3v4\Response\Response->parseBody()
#3 vendor/akeeba/s3/src/Response/Request.php (541): Akeeba\Engine\PostProc\Connector\S3v4\Response\Response->finaliseBody()
#4 vendor/akeeba/s3/src/Response/Connector.php (820): Akeeba\Engine\PostProc\Connector\S3v4\Response\Request->getResponse()
#5 .... Akeeba\Engine\PostProc\Connector\S3v4\Response\Connector->uploadMultipart()
#0 ....: AmazonS3Light->load()

It looks like this happens when I call my custom method that is supposed to upload a file to S3.
Did I make any mistakes here?


use Akeeba\Engine\Postproc\Connector\S3v4\Configuration;
use Akeeba\Engine\Postproc\Connector\S3v4\Connector;
use Akeeba\Engine\Postproc\Connector\S3v4\Input;

class AmazonS3Light
{
    ...

    public function load(string $source_file, string $destination_file = ''): string
    {
        $input       = Input::createFromFile($source_file);
        $upload_id   = $this->connector->startMultipart($input, $this->bucket, $this->destination_file);
        $e_tags      = [];
        $e_tag       = null;
        $part_number = 0;

        do
        {
            $input = Input::createFromFile($source_file);
            $input->setUploadID($upload_id);
            $input->setPartNumber(++$part_number);

            $e_tag = $this->connector->uploadMultipart($input, $this->bucket, $this->destination_file);

            if (!is_null($e_tag))
            {
                $e_tags[] = $e_tag;
            }
        }
        while (!is_null($e_tag));

        $input = Input::createFromFile($source_file);
        $input->setUploadID($upload_id);
        $input->setEtags($e_tags);
        $this->connector->finalizeMultipart($input, $this->bucket, $this->destination_file);

        return $this->destination_file;
    }
}

Many thanks!!

PHP warnings after last update

Warning: Class "Akeeba\S3\Acl" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Configuration" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Connector" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Input" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Request" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Response" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Signature" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\StorageClass" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\CannotDeleteFile" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\CannotGetBucket" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\CannotGetFile" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\CannotListBuckets" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\CannotOpenFileForRead" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\CannotOpenFileForWrite" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\CannotPutFile" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\ConfigurationError" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\InvalidAccessKey" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\InvalidBody" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\InvalidEndpoint" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\InvalidFilePointer" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\InvalidRegion" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\InvalidSecretKey" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\InvalidSignatureMethod" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Exception\PropertyNotFound" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Response\Error" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Signature\V2" not found in .../akeeba/s3/src/aliasing.php on line 45
Warning: Class "Akeeba\S3\Signature\V4" not found in .../akeeba/s3/src/aliasing.php on line 45

The problem is that if the original class isn't loaded yet, the class_alias() call in the aliasing.php file will fail, since it doesn't autoload missing classes by default. Changing class_alias() to autoload the class fixes the problem, but then every class is loaded as soon as the Composer autoloader includes the file, which may not be desirable for speed reasons. It would be great if the old class paths were loaded on demand instead; a sketch follows.
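A sketch of what on-demand aliasing could look like (not the library's actual code, and the exact legacy namespace should be double-checked against aliasing.php): register an autoloader that creates the alias only when legacy code first references the old name.

// Sketch: lazily alias legacy Akeeba\Engine\Postproc\Connector\S3v4\* names
// to their Akeeba\S3\* equivalents on first use, instead of eagerly.
spl_autoload_register(function (string $class) {
	$legacyPrefix = 'Akeeba\\Engine\\Postproc\\Connector\\S3v4\\';

	if (strpos($class, $legacyPrefix) === 0)
	{
		$new = 'Akeeba\\S3\\' . substr($class, strlen($legacyPrefix));

		// The third argument tells PHP to autoload the new class if needed.
		class_alias($new, $class, true);
	}
});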

Viewing a folder from S3

Good day,

What I am trying to do is upload a folder to S3. This folder will be protected, so it cannot be accessed directly. I then want a script hosted somewhere else to read the index file of that folder and display it on the website hosted at the other location, as if it were on that server. The index file will of course link to many other files in that folder, which should all work when viewed from the externally hosted pages.

This process will be repeated for many other folders, which creates the need for an interface to manage the S3 views on that website, but they would all use the same API details.

Uploading files through the script is not that important to me.

Can this script work like this? Which part of it would I use?

Would you kindly help me?

Sorry that I had to post here, but I did not find any contact, help, or documentation information.

Thanks a lot in advance,

getBucket() $maxKeys usage

Nicholas, I do apologize if I wasn't clear in my previous request.
Maybe this code will help:

Note: I'm using dev-development f0f6554 version with PHP 8.0.6

I have 1001 files in a bucket/folder. This code never returns (I ran it for more than 40 minutes and it was still going). With 999 files it's super fast (less than 1 second).
Could you please test and tell me what my mistake is?
Could you please give me a code fix proposal to read 1001 files?

$configuration = new Configuration(
	S3KEY,
	S3SECRET,
	'v4',
	S3REGION
);
$connector = new Connector($configuration);
$folder    = "test";
$bucket    = S3BUCKET;
$folder    = trim($folder, '/');

$files = $connector->getBucket($bucket, $folder . '/', null, null);
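For reference, a hedged sketch of paging through more than 1,000 keys manually, assuming the classic getBucket($bucket, $prefix, $marker, $maxKeys, ...) parameter order and an array keyed by object name (as in the old S3.php class this library descends from):

// Sketch: fetch 500 keys per request and continue after the last key seen.
$all    = [];
$marker = null;

do
{
	$batch = $connector->getBucket($bucket, $folder . '/', $marker, 500);
	$all   = array_merge($all, $batch);
	// Assumption: the keys of the returned array are the object names.
	$marker = empty($batch) ? null : array_key_last($batch);
}
while (count($batch) === 500);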

getBucket() looks like there is a limit of 999 files

I tried to use getBucket() to read the contents of a folder in an S3 bucket.
If there are more than 999 files, the function never returns and hangs indefinitely. I also tried with empty files; same issue.
Is there a limit to the number of files getBucket() can list?

Support for DreamObjects S3

Your code does not appear to generate the security key (signature) the way DreamObjects expects.

The URL must not include the HTTP scheme. This looks to be handled in S3Filesystem.php, getFormConnection().

The code in Request.php does not need to add the bucket to the URL, for example at line 672.

Also, I have been trying to get the thumbnails to work right, as they keep coming out oversized.

Let me know what you would like me to test and what info you need.

Cannot upload empty files - Missing input parameters

I'm trying to use the putObject method to upload small files to my S3 bucket. Non-empty files upload correctly, while empty ones do not. Is it possible to upload empty files?
Here is my code:

use Akeeba\Engine\Postproc\Connector\S3v4\Configuration;
use Akeeba\Engine\Postproc\Connector\S3v4\Connector;
use Akeeba\Engine\Postproc\Connector\S3v4\Input;

defined('AKEEBAENGINE') or define('AKEEBAENGINE', 1);
define('S3KEY',    '------------');
define('S3SECRET', '-----------');
define('S3REGION', 'us-east-1');
define('S3BUCKET', 'mybucket');

$configuration = new Configuration(S3KEY, S3SECRET, 'v4', S3REGION);
$connector     = new Connector($configuration);
$folder        = "uat";
$bucket        = S3BUCKET;

// ------------- THIS WORKS ----------------
$file  = "non-empty-file.csv"; // ~15 bytes
$input = Input::createFromFile($file);

try
{
    $connector->putObject($input, $bucket, $folder . "/non-empty-file.csv");
}
catch (Exception $e)
{
    echo "\n Exception: " . $e->getMessage();
}

// ------------- THIS GENERATES A [Missing input parameters] EXCEPTION ----------------
$file  = "empty-file.csv"; // 0 bytes
$input = Input::createFromFile($file);

try
{
    $connector->putObject($input, $bucket, $folder . "/empty-file.csv");
}
catch (Exception $e)
{
    echo "\n Exception: " . $e->getMessage();
}
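If it helps narrow this down, a hedged experiment (assuming Input::createFromData() exists in this version): pass the empty body as a string instead of a file. If this throws the same "Missing input parameters" exception, the validation treats a zero-length body as absent, rather than the file handle being the problem.

// Assumption: Input::createFromData() is available in this version.
$input = Input::createFromData('');

try
{
    $connector->putObject($input, $bucket, $folder . "/empty-file.csv");
}
catch (Exception $e)
{
    echo "\n Exception: " . $e->getMessage();
}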

getObject cannot get XML files

If I attempt to get an XML file from a bucket, the Response class thinks the XML is an error message from Amazon: it parses the body into a SimpleXMLElement and then somehow resets the actual body content to the last received chunk, failing to return the correct XML content.

Saving to a file still works, because that happens while the response is received, but returning the object as a string fails to return the expected content.

Upgrading from version 2.0

Description:
I am having this issue while trying to validate an upgrade of this lib from v2.0 for an external project. A non-AWS provider is used. From debugging the Request object created, I found the following headers:

Array
(
    [0] => x-amz-acl: public-read
    [1] => Host: storage:9090
    [2] => Date: Sun, 04 Feb 2024 22:43:34 +0000
    [3] => Content-MD5: aLRqb9GImLBE62a9FL7opQ==
    [4] => Content-Type: image/jpeg
    [5] => Content-Length: 9111
    [6] => Authorization: AWS access_key:XSdth2mvbuJqcif1JX9bF1q5byc=
)

Fatal error:
Uncaught Akeeba\S3\Exception\CannotPutFile: Akeeba\S3\Connector::putObject(): [500] AccessDenied:AWS authentication requires a valid Date or x-amz-date header

Debug info:
SimpleXMLElement Object

(
    [Code] => AccessDenied
    [Message] => AWS authentication requires a valid Date or x-amz-date header
    [RequestId] => tx54473b1f03e048108d92a-0065c00ea0
)

Stack trace:
in /var/www/html/vendor/akeeba/s3/src/Connector.php:149
Akeeba\S3\Connector->putObject()

Add an option to switch between `X-Amz-Date` and `Date`

Some third party S3 implementations only expect the X-Amz-Date custom HTTP header as part of the request (and the signature, especially V2), whereas we send the standard HTTP Date header. We can't revert that change because it will break other third party implementations. Therefore we need a switch.

This is a problem we have seen in several places, e.g. #34 and a few (private) tickets over the last few months.
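Purely as an illustration of the requested switch (a hypothetical API; this issue is the feature request itself, so the eventual method name is not confirmed here):

// Hypothetical method name, for illustration only: let callers choose whether
// the standard Date header or the X-Amz-Date header is sent and signed.
$configuration->setDateHeaderMode('x-amz-date');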

Solved: PHP Config Issue - small file upload works, files over 2MB fail

I can upload smaller files (200KB) flawlessly, but when I try a file such as a digital camera picture (3MB, still not very big), I get a "cannot open for reading" error thrown via the CannotOpenFileForRead exception.

I can't see RAM being an issue for a 3MB file (I have 256MB allocated to PHP); this is an AWS server with no production workload.

Here's the portion of code I'm using. It works for small files, but not for a 3MB file, for instance.

$source_file = $_FILES['file']['tmp_name'];
$input = Input::createFromFile($source_file);
$connector->putObject($input, $document_bucket, $file_with_path);

Any ideas what config or such I might have set wrong? I assumed multipart uploads shouldn't be required for files of this size, since the library only chunks into 5MB parts anyway.
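Since the title says this turned out to be a PHP configuration issue: uploads larger than PHP's limits never reach $_FILES['file']['tmp_name'], so Input::createFromFile() has nothing to open. A quick check of the likely culprits (ini_get() and the upload error codes are core PHP):

// Files above these limits are dropped before your script sees them, which
// surfaces here as "cannot open for reading".
var_dump(
	ini_get('upload_max_filesize'), // often defaults to 2M, matching the symptom
	ini_get('post_max_size'),
	$_FILES['file']['error']        // UPLOAD_ERR_INI_SIZE (1) would confirm it
);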

Presigned upload URLS

(Unrelated: hi @nikosdion, love the comments about bloat, and glad you wrote this library so I don't have to import the insane AWS SDK. Regarding your CloudFront costs, maybe Cloudflare R2 could be cheap for you. No egress cost! :)

There is one feature I am currently missing: presigned upload URLs, so a client can upload directly to S3 without going through a backend. Great for e.g. uploading large media files. I have been poking around in the Signature class, but I am not quite sure yet whether this is actually a different feature, or whether it could simply be merged into the getAuthenticatedURL functionality, or even achieved by adding a few custom headers. Thanks in advance!

Docs for reference: https://docs.aws.amazon.com/AmazonS3/latest/userguide/PresignedUrlUploadObject.html

getBucket() and file timestamp

I'm using the getBucket() method to list the contents of a folder in a bucket. Once I get the list of files, I need to check the last-modified datetime. The array returned by getBucket() has a "time" element, but this doesn't look like the last-modified date of the file; it is more like the time when the object was created in the folder.
Let me try to explain a bit better.

If you have a file in a bucket/folder created on 2021-01-29 10:00:00 and you copy it into another folder in S3 on 2021-11-22 07:30:00, then getBucket() returns the timestamp 2021-11-22 07:30:00 (the creation date of the file in the new folder) and not the original 2021-01-29 10:00:00 (the original creation date).

I see the "time" comes from

'time' => strtotime((string) $c->LastModified)

Is there any possibility to get the real file creation datetime and not the modified one?

Any way of testing from localhost? [Question]

This setup works great for me when running on an AWS instance, using:

$role            = file_get_contents('http://169.254.169.254/latest/meta-data/iam/security-credentials/');
$jsonCredentials = file_get_contents('http://169.254.169.254/latest/meta-data/iam/security-credentials/' . $role);
$credentials     = json_decode($jsonCredentials, true);

$configuration = new Configuration(
	$credentials['AccessKeyId'],
	$credentials['SecretAccessKey'],
	'v4',
	AWS_REGION
);
$configuration->setToken($credentials['Token']);
$connector = new Connector($configuration);

However, I'm wondering if there's any easy way of testing with it on localhost. (Since it's only one small component of the application, and when making changes, it's nice to test everything before it hits the server.)

Setting up an Access Key / Secret from the AWS Console doesn't provide a token, so I tried, for instance, running this on a test page on an AWS instance:

$role = file_get_contents('http://169.254.169.254/latest/meta-data/iam/security-credentials/');
$jsonCredentials = file_get_contents('http://169.254.169.254/latest/meta-data/iam/security-credentials/' . $role);
$credentials = json_decode($jsonCredentials, true);

echo $credentials['AccessKeyId'];
echo '<br />';
echo $credentials['SecretAccessKey'];
echo '<br />';
echo $credentials['Token'];

That gave me the 3 strings. (And I realize they'll have a fairly short lifespan.) Then I tried passing them like this:


$configuration = new Configuration(
	'AccessKeyIdHere',
	'SecretAccessKeyHere',
	'v4',
	'AWS_Region_Here'
);
$configuration->setToken('Token_Provided_Here');
$connector = new Connector($configuration);

But when I try to use it, I get the following error:

Fatal error: Uncaught Akeeba\Engine\Postproc\Connector\S3v4\Exception\CannotGetBucket: Akeeba\Engine\Postproc\Connector\S3v4\Connector::internalGetBucket(): [60] SSL certificate problem: unable to get local issuer certificate in C:\project\vendor\akeeba\s3\src\Connector.php:935 Stack trace: #0 C:\project\vendor\akeeba\s3\src\Connector.php(460): Akeeba\Engine\Postproc\Connector\S3v4\Connector->internalGetBucket('BUCKETNAME...', 'folder...', NULL, NULL, '/', false) #1 C:\project\handlers\media_upload.php(83): Akeeba\Engine\Postproc\Connector\S3v4\Connector->getBucket('BUCKETNAME...', 'folder...') #2 {main} thrown in C:\project\vendor\akeeba\s3\src\Connector.php on line 935

SSL certificate problem: unable to get local issuer certificate

I have a CURL pem certificate package setup with my localhost PHP / Apache setup, but I'm obviously missing something.

There aren't quite the same options as in the bloated AWS SDK, which appears to have some provision for this:

https://stackoverflow.com/questions/24620393/aws-ssl-security-error-curl-60-ssl-certificate-prob-unable-to-get-local

I get the same error using this method as outlined in the Readme:

$configuration = new Configuration(
	'YourAmazonAccessKey',
	'YourAmazonSecretKey'
);

$connector = new Connector($configuration);

If I add:
$configuration->setSSL(false);

Then I get:

[500] InvalidRequest:The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256

Has anyone else used this package with localhost and have any suggestions?
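For the certificate error specifically, a hedged pointer: on Windows, PHP's cURL often has no certificate authority bundle configured, so peer verification fails for every HTTPS request, not just S3. The relevant php.ini directives are curl.cainfo and openssl.cafile; checking what is currently set is a one-liner:

// If both come back empty, point curl.cainfo in php.ini at a cacert.pem
// bundle and restart the web server.
var_dump(ini_get('curl.cainfo'), ini_get('openssl.cafile'));

(Separately, the AWS4-HMAC-SHA256 error after setSSL(false) suggests the two-argument Configuration from the README snippet defaulted to v2 signatures; passing 'v4' and the region, as in your first snippet, should address that part.)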
