
azure-storage-ruby's Introduction

Microsoft Azure Storage Client Library for Ruby (Deprecated)

This project will be in Community Support until 13 September 2024. After this date the project and associated client libraries will be retired permanently. For more details on the retirement and alternatives to using this project, visit Retirement notice: The Azure Storage Ruby client libraries will be retired on 13 September 2024.



This project provides Ruby packages that make it easy to access and manage Microsoft Azure Storage services.

Library Packages

Note:

  • x64 Ruby for Windows is known to have some compatibility issues.
  • Each service gem depends on the nokogiri gem. For Ruby versions lower than 2.2, please install a compatible nokogiri before trying to install azure-storage.

Getting Started for Contributors

If you would like to become an active contributor to this project, please follow the instructions provided in the Azure Projects Contribution Guidelines. You can find more details on contributing in CONTRIBUTING.md.

Provide Feedback

If you encounter any bugs with the library please file an issue in the Issues section of the project.

Azure Storage SDKs and Tooling

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

azure-storage-ruby's People

Contributors

abelhu, andyliuliming, ball-hayden, bsawicki, c-w, flavorjones, ganesh-kumolus, graf, jeltef, kakuzei, katmsft, kule, makhdumi, masarakki, mattt, microsoft-github-policy-service[bot], namiwang, schoag-msft, seanmcc-msft, vinjiang, vishrutshah, wonda-tea-coffee, yaxia, yovasx2


azure-storage-ruby's Issues

[0.11.4]: Does not support setting 'User-Agent'

azure-storage-ruby 0.11.4 does not support setting 'User-Agent'; it always uses the default 'User-Agent'.


        def common_headers(options = {})
          headers = {
            'x-ms-version' => Azure::Storage::Default::STG_VERSION,
            'User-Agent' => Azure::Storage::Default::USER_AGENT
          }
          headers.merge!({'x-ms-client-request-id' => options[:request_id]}) if options[:request_id]
          headers
        end
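A minimal sketch of how common_headers could honor a caller-supplied value; the options[:user_agent] key is hypothetical and not part of the current gem:

    def common_headers(options = {})
      headers = {
        'x-ms-version' => Azure::Storage::Default::STG_VERSION,
        # Hypothetical: prefer a caller-supplied User-Agent when one is given.
        'User-Agent' => options[:user_agent] || Azure::Storage::Default::USER_AGENT
      }
      headers['x-ms-client-request-id'] = options[:request_id] if options[:request_id]
      headers
    end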

Signing key error

Using azure lib 0.7.6, I get the following error:

/Users/micahsmith/.rbenv/versions/2.2.2/lib/ruby/gems/2.2.0/gems/azure-core-0.1.5/lib/azure/core/auth/signer.rb:32:in `initialize': Signing key must be provided (ArgumentError)
        from /Users/micahsmith/.rbenv/versions/2.2.2/lib/ruby/gems/2.2.0/gems/azure-core-0.1.5/lib/azure/core/auth/shared_key.rb:33:in `initialize'
        from /Users/micahsmith/.rbenv/versions/2.2.2/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/blob_service.rb:43:in `new'
        from /Users/micahsmith/.rbenv/versions/2.2.2/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/blob_service.rb:43:in `initialize'
        from upload.rb:12:in `new'
        from upload.rb:12:in `<main>'

with the following code:

client = Azure::Storage::Client.create(:storage_account_name => accountName, :storage_access_key => storageKey)

## throws here
@blobs = Azure::Storage::Blob::BlobService.new
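A hedged workaround: build the blob service from the configured client (the blob_client accessor shown in a later issue) instead of calling BlobService.new with no arguments, which otherwise expects global setup or environment variables.

    client = Azure::Storage::Client.create(:storage_account_name => accountName, :storage_access_key => storageKey)
    @blobs = client.blob_client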

Relax JSON 1.x

The library has a hard dependency on the 1.x JSON gem, which breaks newer gems that are using 2.x JSON. Can the dependency be relaxed or removed from the gemspec?
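A hedged sketch of what a relaxed constraint in the gemspec could look like; the exact bounds are an assumption, not a maintainer decision:

    # azure-storage.gemspec (hypothetical constraint)
    spec.add_runtime_dependency("json", ">= 1.8", "< 3")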

Provide more details for the errors

In many cases you get the generic error "one of the request inputs is not valid", which is returned by the REST API; however, the API also returns detail about which field or input is not valid, and I am not seeing that in the exception message.

Error when creating a storage client

When executing client = Azure::Storage.create(:storage_account_name => "###", :storage_access_key => "###")

receiving

Azure::Storage::InvalidOptionsError: options provided are not valid set: {}

Setup is working...

Thanks

-pankaj

Unable to send large files via #create_block_blob

When attempting to send a large (50MB+) file, things don't quite work :(

/home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/http/http_request.rb:147:in `call': RequestBodyTooLarge (413): The request body is too large and exceeds the maximum permissible limit. (Azure::Core::Http::HTTPError)
RequestId:7c1c2e38-0001-00ad-36b4-38a130000000
Time:2016-11-07T05:06:18.1887668Z
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/http/signer_filter.rb:28:in `call'
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/http/signer_filter.rb:28:in `call'
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/http/http_request.rb:104:in `block in with_filter'
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/service.rb:36:in `call'
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/filtered_service.rb:34:in `call'
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/signed_service.rb:41:in `call'
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/service/storage_service.rb:52:in `call'
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/blob_service.rb:59:in `call'
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/block.rb:114:in `create_block_blob'
        from blob-sync:50:in `block (2 levels) in <main>'
        from /home/tyler/.rvm/rubies/ruby-2.2.0/lib/ruby/2.2.0/set.rb:283:in `each_key'
        from /home/tyler/.rvm/rubies/ruby-2.2.0/lib/ruby/2.2.0/set.rb:283:in `each'
        from blob-sync:45:in `block in <main>'
        from blob-sync:17:in `each'
        from blob-sync:17:in `<main>'
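One way around the single-request size limit is to upload the file in blocks and commit the block list afterwards. A hedged sketch, assuming put_blob_block and commit_blob_blocks accept raw block ids and handle encoding internally (as in the 0.11.x gem); blob_service, container_name, blob_name and path are placeholders:

    # Upload in 4 MB blocks, then commit the block list.
    block_list = []
    File.open(path, "rb") do |file|
      index = 0
      while (chunk = file.read(4 * 1024 * 1024))
        block_id = index.to_s.rjust(5, "0")   # block ids must all share the same length
        blob_service.put_blob_block(container_name, blob_name, block_id, chunk)
        block_list << [block_id, :uncommitted]
        index += 1
      end
    end
    blob_service.commit_blob_blocks(container_name, blob_name, block_list)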

blob download using the ruby azure-storage gem is failing due to 'unknown encoding name' ArgumentError

Here is what I am doing.

Blob service initialization:

Azure::Storage::setup(:storage_account_name => @account_name, 
                      :storage_access_key => @access_key)
@blob_service = Azure::Storage::Blob::BlobService.new

...

Attempt to download a blob, where deconstructed[:blob_name] is the name of the blob:

blob_factual, content = @blob_service.get_blob(container_name, deconstructed[:blob_name], { :timeout => timeout })
File.open("#{deconstructed[:target_dir]}/#{deconstructed[:blob_name]}", "wb") { |f| f.write(content) }

I hit the following error:

WARN: Unable to download: 00094811-D064-41D1-9F98-5BF3C24F05FC due to argument error.
unknown encoding name - application/octet-stream
/usr/local/var/rbenv/versions/2.2.1/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.0.preview/lib/azure/storage/blob/blob_service.rb:63:in `force_encoding'
/usr/local/var/rbenv/versions/2.2.1/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.0.preview/lib/azure/storage/blob/blob_service.rb:63:in `call'
/usr/local/var/rbenv/versions/2.2.1/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.0.preview/lib/azure/storage/blob/blob.rb:89:in `get_blob'
move_blobs_azure.rb:63:in `download_blob'
move_blobs_azure.rb:49:in `block in get_blobs_from_container'
move_blobs_azure.rb:45:in `each'
move_blobs_azure.rb:45:in `get_blobs_from_container'
move_blobs_azure.rb:138:in `block in <main>'
move_blobs_azure.rb:135:in `each'
move_blobs_azure.rb:135:in `<main>'

It appears to be failing at a call to .force_encoding(...).

SHOULD NOT retry when Azure returns 404 NOT_FOUND

blob_client.get_blob_properties(container_name, name, options)

When I tried the command above for a non-existent blob, the result came back after a long time. But if I comment out the code below, the result comes back very quickly. It seems that the SDK retries when receiving a 404.

blob_client.with_filter(Azure::Storage::Core::Filter::ExponentialRetryPolicyFilter.new)

Mixed camel case naming in lib/azure/storage/client.rb


xxxClient => xxx_client

    def blobClient(options = {})
      @blobClient ||= Azure::Storage::Blob::BlobService.new(default_client(options))
    end

    # Azure Queue service client configured from this Azure Storage client instance
    # @return [Azure::Storage::Queue::QueueService]
    def queueClient(options = {})
      @queueClient ||= Azure::Storage::Queue::QueueService.new(default_client(options))
    end

    # Azure Table service client configured from this Azure Storage client instance
    # @return [Azure::Storage::Table::TableService]
    def tableClient(options = {})
      @tableClient ||= Azure::Storage::Table::TableService.new(default_client(options))
    end

It would be better to add the snippet below to the README to show how to get the blob/table/queue clients after creating the default client:

  require "azure/storage"

  # Setup a specific instance of an Azure::Storage::Client
  client = Azure::Storage::Client.create(:storage_account_name => "your account name", :storage_access_key => "your access key")

  blobs = client.blob_client
  queues = client.queue_client
  tables = client.table_client

Direct upload and `x-ms-blob-type` header

Hey!

I'm using Azure for direct upload from a Rails app. My controller generates a signed_url using Azure::Storage::Core::Auth::SharedAccessSignature#signed_uri and returns it as a JSON response.
From my JS I then upload the file to that signed_url.

Problem:
My client-side upload works only if I specify 'x-ms-blob-type': 'BlockBlob' in the headers. Can I specify this value on the signer so that it is appended to the query string, instead of manually adding the header on the JS side?

My signer code

signer = Azure::Storage::Core::Auth::SharedAccessSignature.new(storage_account_name, storage_access_key)
signer.signed_uri(URI(base_url), false, permissions: "rw", service: "b", resource: "b", expiry: expires_in).to_s

Is something like this possible?

signer.signed_uri(URI(base_url), false, permissions: "rw", service: "b", resource: "b", blob_type: "BlockBlob", expiry: expires_in).to_s

list_blobs doesn't work at all when delimiter is specified

When list_blobs is called with a delimiter, e.g. /, it does not return any results despite the server returning a valid response.

This part in blob_enumeration_results_from_xml doesn't check for (xml > "Blobs") > "BlobPrefix" which is what the server returns when delimiter is specified.

If this is actually an issue, it's very surprising since this use case of list_blobs is extremely common and even the default in other client libraries. It's needed when trying to traverse a pseudo-directory structure in blob storage, otherwise, results are returned "flat" or "recursive," and with thousands of blobs, it's not usable.

See the .NET client, which uses a delimiter (/) by default (useFlatBlobListing is false by default).

I would submit a pull request (after signing the CLA) but don't know how you want the results of this returned, given that the results don't represent actual Blobs, just prefixes.
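For reference, the failing call looks like this; a hedged sketch assuming the :delimiter option is passed straight through to the REST call:

    # Expected: blob prefixes for the first "directory" level, but the XML
    # deserializer currently ignores the BlobPrefix elements.
    results = blob_service.list_blobs(container_name, :delimiter => "/")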

Connection reset by peer exception after getting blobs for a period of time

I have logic to keep reading blobs. Something like:

Repeat the following code every 5 seconds:

blob, content = @azure_blob.get_blob(@container, blob_name, {:start_range=>start_index} )

After the code ran for about 24 hours, I got the following exception:

["org/jruby/ext/openssl/SSLSocket.java:809:in `sysread_nonblock'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/shared/jopenssl19/openssl/buffering.rb:174:in `read_nonblock'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/protocol.rb:141:in `rbuf_fill'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/protocol.rb:92:in `read'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:2764:in `read_body_0'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:2719:in `read_body'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1052:in `get'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1331:in `transport_request'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:2680:in `reading_body'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:2679:in `reading_body'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1330:in `transport_request'", 
"org/jruby/RubyKernel.java:1242:in `catch'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1325:in `transport_request'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1302:in `request'", "
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1295:in `request'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:746:in `start'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:744:in `start'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1293:in `request'", 
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1035:in `get'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:80:in `perform_request'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:40:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:87:in `with_net_http_connection'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:32:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday_middleware-0.11.0.1/lib/faraday_middleware/response/follow_redirects.rb:78:in `perform_with_redirection'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday_middleware-0.11.0.1/lib/faraday_middleware/response/follow_redirects.rb:66:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/rack_builder.rb:139:in `build_response'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/connection.rb:377:in `run_request'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/http_response_helper.rb:27:in `set_up_response'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:143:in `call'", 
"org/jruby/RubyMethod.java:116:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/retry_policy.rb:41:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:104:in `with_filter'", 
"org/jruby/RubyMethod.java:116:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/signer_filter.rb:28:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:104:in `with_filter'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/service.rb:36:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/filtered_service.rb:34:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/signed_service.rb:41:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-storage-0.11.4.preview/lib/azure/storage/service/storage_service.rb:52:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-storage-0.11.4.preview/lib/azure/storage/blob/blob_service.rb:59:in `call'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-storage-**0.11.4.preview/lib/azure/storage/blob/blob.rb:91:in `get_blob'",** 
"/usr/share/logstash/vendor/local_gems/950703aa/logstash-input-azureblob-0.9.8/lib/logstash/inputs/azureblob.rb:100:in `process'", 
"/usr/share/logstash/vendor/local_gems/950703aa/logstash-input-azureblob-0.9.8/lib/logstash/inputs/azureblob.rb:95:in `process'", 
"/usr/share/logstash/vendor/local_gems/950703aa/logstash-input-azureblob-0.9.8/lib/logstash/inputs/azureblob.rb:76:in `run'", 
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:456:in `inputworker'", 
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:449:in `start_input'"] {:exception=>#<IOError: Connection reset by peer>}
[2017-07-23T23:57:12,818][ERROR][logstash.inputs.logstashinputazureblob] Oh My, An error occurred.

Is this an SDK bug? Or is there anything I should do as the caller, such as flushing a buffer? Please let me know if more info is needed.

copy_blob_from_uri should raise the response as an exception if the response contains neither 'x-ms-copy-id' nor 'x-ms-copy-status'

copy_blob_from_uri should raise the response as an exception if the response contains neither 'x-ms-copy-id' nor 'x-ms-copy-status'.
One of our customers had copy_blob_from_uri fail but could not get any useful information, because the caller got ['', ''] as the result.

    def copy_blob_from_uri(destination_container, destination_blob, source_blob_uri, options={})
      ...
      response = call(:put, uri, nil, headers)
      return response.headers['x-ms-copy-id'], response.headers['x-ms-copy-status']
    end

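A hedged sketch of the requested behavior; the guard and error message are assumptions, not the library's actual implementation:

      response = call(:put, uri, nil, headers)
      copy_id     = response.headers['x-ms-copy-id']
      copy_status = response.headers['x-ms-copy-status']
      # Hypothetical guard: surface the raw response instead of returning ['', ''].
      raise "copy_blob_from_uri failed: #{response.inspect}" if copy_id.nil? || copy_id.empty?
      return copy_id, copy_status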

Fails to insert records into a storage account table

I randomly get an error when inserting a record into a storage account table. Sometimes it can be reproduced, sometimes not.

Error log:

   Failed creating missing vms > etcd_z1/0 (e4f6733c-7c66-4f3a-8781-1e716ac3e117): insert_entity: #<RuntimeError: Xml is not a entry node.>
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/azure-storage-0.10.2.preview/lib/azure/storage/service/serialization.rb:297:in `expect_node'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/azure-storage-0.10.2.preview/lib/azure/storage/table/serialization.rb:102:in `hash_from_entry_xml'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/azure-storage-0.10.2.preview/lib/azure/storage/table/table_service.rb:220:in `insert_entity'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/table_manager.rb:54:in `insert_entity'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/stemcell_manager.rb:114:in `handle_stemcell_in_different_storage_account'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/stemcell_manager.rb:70:in `has_stemcell?'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/cloud.rb:107:in `block in create_vm'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/bosh_common-1.3262.4.0/lib/common/thread_formatter.rb:49:in `with_thread_name'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/cloud.rb:104:in `create_vm'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/bosh_cpi-1.3262.4.0/lib/bosh/cpi/cli.rb:70:in `public_send'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/bosh_cpi-1.3262.4.0/lib/bosh/cpi/cli.rb:70:in `run'
/var/vcap/packages/bosh_azure_cpi/bin/azure_cpi:35:in `<main>' (00:00:25)

list_blobs does not raise the exception but returns the exception when the container does not exist

When the container does not exist, list_blobs should raise an exception. But in v0.10.2, it returns the exception instead.

https://github.com/Azure/azure-storage-ruby/blob/master/lib/azure/storage/blob/container.rb#L551

def list_blobs(name, options={})
  # Call
  response = call(:get, uri)

  # Result
  if response.success?
    Serialization.blob_enumeration_results_from_xml(response.body)
  else
    response.exception
  end
end
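A hedged sketch of the expected behavior, raising instead of returning the exception object; whether this is the fix the maintainers would choose is an assumption:

      if response.success?
        Serialization.blob_enumeration_results_from_xml(response.body)
      else
        raise response.exception   # propagate the error instead of returning it
      end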

Connection time out to Azure blob storage

After issue #334 was fixed, it no longer hangs, but a connection timeout is now thrown.

Could you add a retry for the timeout, with the max retry count set to 10? Thanks.

Logs:
":"Rescued Unknown: Connection timed out - connect(2) for "13.68.167.248" port 443. backtrace: /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/resolv-replace.rb:23:in initialize' /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/resolv-replace.rb:23:ininitialize'
/var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:879:in open' /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:879:inblock in connect'
/var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/timeout.rb:76:in timeout' /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:878:inconnect'
/var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:863:in do_start' /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:852:instart'
/var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:1369:in request' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:82:inperform_request'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:40:in block in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:87:inwith_net_http_connection'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:32:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday_middleware-0.10.0/lib/faraday_middleware/response/follow_redirects.rb:76:inperform_with_redirection'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday_middleware-0.10.0/lib/faraday_middleware/response/follow_redirects.rb:64:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/rack_builder.rb:139:inbuild_response'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/connection.rb:377:in run_request' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/http/http_request.rb:145:incall'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/http/signer_filter.rb:28:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/http/signer_filter.rb:28:incall'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/http/http_request.rb:99:in block in with_filter' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/service.rb:36:incall'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/filtered_service.rb:34:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/signed_service.rb:41:incall'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-0.7.4/lib/azure/blob/blob_service.rb:1385:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-0.7.4/lib/azure/blob/blob_service.rb:785:inget_blob_properties'
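For the azure-storage gem itself, retries can be attached as a filter. A hedged sketch, assuming ExponentialRetryPolicyFilter takes the retry count as its first argument:

    # Retry up to 10 times with exponential back-off (constructor arguments are an assumption).
    blob_client.with_filter(Azure::Storage::Core::Filter::ExponentialRetryPolicyFilter.new(10))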

Ruby SDK should have the ability to create a blob from a stream

blob_service.create_blob_from_stream(container, blob, stream) is present in the Python SDK; its reference can be checked at Create Blob Using Stream.

I couldn't find a similar API in the Ruby SDK (project link: Git link).

Could you please let me know if the functionality already exists in the Ruby SDK and, if yes, how we can use it? Otherwise, what is the timeline for this feature in the Ruby SDK?

Azure checks the MD5 hash when 'Content-MD5' is not specified.

According to the doc, the Azure Storage service only checks the hash of the content that arrived against the one that was sent when Content-MD5 is specified.
But when I called create_block_blob with headers containing an invalid MD5 hash in x-ms-blob-content-md5 and no Content-MD5, Azure returned the error below.

<Code>Md5Mismatch</Code><Message>The MD5 value specified in the request did not match with the MD5 value calculated by the server. </Message>
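For context, a hedged repro sketch; mapping :content_md5 to x-ms-blob-content-md5 (with Content-MD5 controlled separately by :transactional_md5) is my assumption about the gem's option names, and blob_service, container_name, blob_name and content are placeholders:

    require "base64"
    require "digest"

    # x-ms-blob-content-md5 is deliberately wrong here and Content-MD5 is omitted,
    # yet the service still answers with Md5Mismatch.
    bogus_md5 = Base64.strict_encode64(Digest::MD5.digest("not the real content"))
    blob_service.create_block_blob(container_name, blob_name, content, :content_md5 => bogus_md5)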

Issue while listing page blob ranges

I am not able to get the list method for page blob ranges to work. I am not sure if it's really a bug or if I am using this method incorrectly; apologies if so.

page_data = 512.times.map { [*'0'..'9', *'a'..'z'].sample }.join
blob_service.put_blob_pages(container_name, blob_name, 0, 511, page_data, {})
pages = blob_service.list_page_blob_ranges(container_name, blob_name, { :start_range => 0, :end_range => 511})

I expected to get at least one page with the data I just pushed, but I am not seeing the data in the first element of the returned array.

Cannot insert a record after querying record from a table (Azure v0.7.0)

Migrated from the Azure SDK: Azure/azure-sdk-for-ruby#281

Inserting a record after querying records from a table always fails.
The root cause is that the second request uses the same Faraday connection as the first URI.

ARM returns the error below: One of the request inputs is not valid.

The key is not correct in http_client.rb

def agents(uri)
  uri = URI.parse(uri) if uri.is_a?(String)
  key = uri.scheme.to_s + uri.host.to_s + uri.port.to_s
  @agents ||= {}
  unless @agents.key?(key)
    @agents[key] = build_http(uri)
  end
  @agents[key]
end

Unable to get the full list of blobs in a container

I am trying to list all blobs in a storage account, but I am only getting a few out of hundreds using list_blobs.

blobs = blob_service.list_blobs(container_name)

def blockblob_operations(blob_service)
  containers = blob_service.list_containers()
  containers.each do |container|
    puts "Container Name: #{container.name}"
  end

  # List all the blobs in the container
  puts 'List Blobs in Container'
  container_name = 'name'
  blobs = blob_service.list_blobs(container_name)
  blobs.each do |blob|
    puts "#{blob.name} "
  end
end

This only shows 6 blobs in the container, but there are more than that. I tried using other options too, but got the same output.
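If the missing blobs are on later result pages, the pagination has to be followed explicitly. A hedged sketch of following continuation tokens, assuming list_blobs honors the :marker option and its result responds to continuation_token (as the gem's enumeration results do):

    all_blobs = []
    marker = nil
    loop do
      page = blob_service.list_blobs(container_name, :marker => marker)
      all_blobs.concat(page)
      marker = page.continuation_token
      break if marker.nil? || marker.empty?
    end
    all_blobs.each { |blob| puts blob.name }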

Illegal use of Content-Encoding

This client is still conflating Content-Encoding with character-set encoding, here:

def call(method, uri, body=nil, headers={})

Valid values for Content-Encoding are things like 'gzip' or 'compress'. For example:

Content-Encoding: gzip

The character-set encoding is found at the end of the Content-Type header, e.g.

Content-Type: application/json; charset=utf-8

My colleague @thomas-schreiter filed a PR against the old SDK addressing the same call (def call(method, uri, body=nil, headers={})).

The subclass of RetryPolicyFilter overriding the method apply_retry_policy fails to retry by setting 'retry_data[:retryable] = true'

According to the comments on the method apply_retry_policy, a subclass of RetryPolicyFilter should override apply_retry_policy. But if the subclass's apply_retry_policy decides to retry by setting retry_data[:retryable] = true, the should_retry? method of RetryPolicyFilter overrides this value on line 46, so the retry never happens.

Suggested change:

    def should_retry?(response, retry_data)
      apply_retry_policy retry_data
      retry_data[:retryable] = retry_data[:count] <= @retry_count
      return false unless retry_data[:retryable]
      
      should_retry_on_local_error? retry_data
      should_retry_on_error? response, retry_data
      return false unless retry_data[:retryable]
      
      adjust_retry_parameter retry_data
    end

=>

    def should_retry?(response, retry_data)
      retry_data[:retryable] = false
      apply_retry_policy retry_data
      return false if retry_data[:count] > @retry_count
      
      retryable = retry_data[:retryable] || should_retry_on_local_error?(retry_data) || should_retry_on_error?(response, retry_data)
      return false unless retryable
      
      adjust_retry_parameter retry_data
    end

Using BlobService only with Endpoint & SAS-Token

Hi there.

I'm trying to figure out how I would use the Ruby SDK to upload a new object to a container.
I tried the create_from_connection_string method, but it fails later when using the blob_client.

config = %{BlobEndpoint=https://#{someaccount}.blob.core.windows.net;SharedAccessSignature=sv=2015-12-11&ss=b&srt=o&sp=rw&se=2020-02-18T01:13:23Z&st=2017-02-17T17:13:23Z&spr=https&sig=1A5sp62Cx2ATRgNTzMVY2H73wjDdYdWxHXOPVw%2BMfEs%3D}
client = Azure::Storage::Client.create_from_connection_string(config)
blogs = client.blob_client

would crash with an exception:

/Users/magegu/.rvm/gems/ruby-2.3.1/gems/azure-core-0.1.7/lib/azure/core/auth/signer.rb:32:in `initialize': Signing key must be provided (ArgumentError)
	from /Users/magegu/.rvm/gems/ruby-2.3.1/gems/azure-core-0.1.7/lib/azure/core/auth/shared_key.rb:33:in `initialize'
	from /Users/magegu/.rvm/gems/ruby-2.3.1/gems/azure-storage-0.11.5.preview/lib/azure/storage/blob/blob_service.rb:43:in `new'

Any suggestion on how to solve this with the current SDK/API?

Thanks
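A hedged workaround, assuming a gem version that supports the :storage_sas_token option; with no shared key available, the client then signs requests with the SAS token instead:

    sas_token = "sv=2015-12-11&ss=b&srt=o&sp=rw&se=2020-02-18T01:13:23Z&st=2017-02-17T17:13:23Z&spr=https&sig=..."
    client = Azure::Storage::Client.create(
      :storage_account_name => someaccount,
      :storage_sas_token    => sas_token
    )
    blobs = client.blob_client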

azure-storage-ruby gem should support using nokogiri >= 1.7.1

nokogiri has recently been updated to address several CVEs in the libxml2 library: link.

azure-storage is locked to versions of nokogiri that conform to ~> 1.6.0, which prevents us from updating our nokogiri to the patched version.

Is there anything preventing azure-storage from using newer versions of nokogiri? We would like to update our azure-storage to a version that supports nokogiri 1.7.1.

PUBLIC NOTICE: Azure Storage Ruby Client Library Update

We are pleased to announce an update to our Ruby client library for Storage.
We received storage service feature support requests and requests to fix a few issues in the Azure SDK for Ruby. To address your requests, we are working towards supporting new features released on the service and bringing the Ruby Storage library to parity with the newest endpoint. Some of these changes include AppendBlob support, File service support, replacing AtomPub with JSON in Table, Account SAS, improved response parsing and many other small features and API updates.

To support the above, we will need to make some breaking changes. The first round of changes will be the largest and will bring the library up to the Feb 2015 service version. We will focus on the storage features now present in this repository and release as the azure-storage package on RubyGems. All of this work is being done in the dev branch and we will be maintaining ChangeLog and BreakingChanges files so you can see a summary of what’s happening. Note that this branch should not be considered stable!

After bringing the library up to the latest version, we will then begin adding client-side improvements. We are considering items such as built-in MD5 calculation, automatically following continuation tokens on list APIs, improved queue message encoding support, and many others, including improvements the community suggests. So please provide us your feedback on what you would like to see implemented in the Ruby Storage client library.

We’d love to hear from you!

Thanks,
The Azure Storage Team

get_blob_properties hangs while the vhd is being copied

I am copying a vhd from one subscription to the other.

x = copy_blob_from_uri( 'container', 'vhd_name', source_uri)

blob = blob_client.get_blob_properties('container', 'vhd_name')
The timeout I get is:

/Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:879:in `initialize': execution expired (Faraday::ConnectionFailed)
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:879:in `open'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:879:in `block in connect'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/timeout.rb:88:in `block in timeout'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/timeout.rb:98:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/timeout.rb:98:in `timeout'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:878:in `connect'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:863:in `do_start'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:852:in `start'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:1375:in `request'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:82:in `perform_request'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:40:in `block in call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:87:in `with_net_http_connection'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:32:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday_middleware-0.11.0.1/lib/faraday_middleware/response/follow_redirects.rb:78:in `perform_with_redirection'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday_middleware-0.11.0.1/lib/faraday_middleware/response/follow_redirects.rb:66:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/rack_builder.rb:139:in `build_response'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/connection.rb:377:in `run_request'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/http_response_helper.rb:27:in `set_up_response'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:143:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/http/signer_filter.rb:28:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/http/signer_filter.rb:28:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:104:in `block in with_filter'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/service.rb:36:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/filtered_service.rb:34:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/signed_service.rb:41:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-storage-0.12.1.preview/lib/azure/storage/service/storage_service.rb:53:in `call'
	from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-storage-0.12.1.preview/lib/azure/storage/blob/blob_service.rb:59:in `call'

How to set content type while uploading a file to Azure storage?

I wrote a script to copy all the files from my S3 to Azure Storage. The upload completes, but for all those files the content type is stored as text/plain (while the file is actually a PDF). How do I set the content type for a file while uploading?

And is there any way to bulk edit the content-type property for all the files?
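A hedged sketch, assuming create_block_blob accepts a :content_type option and set_blob_properties can update it afterwards (both appear in the gem's blob service); container_name and content are placeholders:

    # Set the content type at upload time.
    blob_service.create_block_blob(container_name, "report.pdf", content, :content_type => "application/pdf")

    # Or bulk-fix existing blobs afterwards.
    blob_service.list_blobs(container_name).each do |blob|
      blob_service.set_blob_properties(container_name, blob.name, :content_type => "application/pdf")
    end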

No good way to use leases with blobs

Migrated from the Azure SDK: Azure/azure-sdk-for-ruby#189

From @schadr:
I am trying to use leases on blob to ensure that only one host can modify the content.

For that purpose I create a lease on a blob using the acquire_lease operation, but there is no way to provide the lease_id for calls such as create_blob_pages, so the only way to use leases is to ensure a blob does not get modified while reading from it.

Or am I missing something?

Set Content-Disposition (content_disposition) for direct download?

I'm having some difficulty figuring out how to set the Content-Disposition in Ruby so that I can set a "friendly download filename". Upon retrieval of anything I upload, I only get the following non-standard header back (sample URL: https://dcpreservation.blob.core.windows.net/production-master/bpl-dev_j6731378s):

x-ms-meta-Content_disposition: attachment; filename=myfile.tif

The most complex example on my end, setting this property in two ways, is:

metadata = { :content_disposition => 'attachment; filename=myfile.tif' }
options = { :metadata => metadata, :content_type => 'image/tif', :content_disposition => 'attachment; filename=myfile.tif' }
blob = azure_blob_service.create_block_blob(container.name, 'bpl-dev_j6731378s', content, options)

Am I doing something incorrectly? Is there a separate way to set the Content-Disposition header for direct downloads that may be outside the scope of this Ruby library? Or is the only way to make a file available for download with a custom filename to have my application act as an intermediate layer?
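One hedged avenue, not a confirmed answer: passing :content_disposition inside the :metadata hash is what produces the x-ms-meta-* header shown above, while setting it as a blob property is what should surface as the standard Content-Disposition response header, assuming set_blob_properties supports the option in this gem version:

    azure_blob_service.set_blob_properties(container.name, 'bpl-dev_j6731378s',
                                           :content_disposition => 'attachment; filename=myfile.tif')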

Unable to use gem with Rails 4

Hi,
Rails version: 4.2
I created a new app and included the following line in my (blank) Gemfile:

gem 'azure-storage', '~> 0.10.0.preview'

Then I went into the Rails console and tried the following:

Azure::Storage::Core::Auth::SharedAccessSignature.new('blah','blah_again')
=> NameError: uninitialized constant Azure::Storage::Core::Auth::SharedAccessSignature

I think you're using 'autoload' heavily. Any ideas on how to get around this issue?

Best Regards,
RR

create_block_blob function doesn't support IO stream

I tried using the Ruby SDK and redirected a Linux IO stream to the Ruby SDK API.

Example of what I am trying:
cat file | azure_blob_service.create_block_blob("firstcontainer", "image-blob", $stdin)

I am getting the exception below:
/opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/core/service.rb:34:in `call': undefined method `encoding' for #<IO:> (NoMethodError)
Did you mean?  set_encoding
        from /opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/core/filtered_service.rb:33:in `call'
        from /opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/core/signed_service.rb:39:in `call'
        from /opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/blob/blob_service.rb:608:in `create_block_blob'
        from /Users/prateegu/Desktop/script.rb:10:in `<main>'

I checked line 34 of /opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/core/service.rb, and it looks like there is no distinction between a String and an IO stream object. Can you please get it checked and fixed?

Cannot use #create_block_blob method with IO/File object

The call to #encoding only exists on String-like objects.

/home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/blob_service.rb:50:in `call': undefined method `encoding' for #<File:/srv/releases/jenkins/debian/jenkins_2.29_all.deb> (NoMethodError)
        from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/block.rb:114:in `create_block_blob'
        from blob-sync:44:in `block (2 levels) in <main>'
        from /home/tyler/.rvm/rubies/ruby-2.2.0/lib/ruby/2.2.0/set.rb:283:in `each_key'
        from /home/tyler/.rvm/rubies/ruby-2.2.0/lib/ruby/2.2.0/set.rb:283:in `each'
        from blob-sync:39:in `block in <main>'
        from blob-sync:11:in `each'
        from blob-sync:11:in `<main>'
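A hedged workaround until stream support exists: read the file into a binary String first, so #encoding is available on the payload; blob_service and container_name are placeholders.

    path = "/srv/releases/jenkins/debian/jenkins_2.29_all.deb"
    content = File.binread(path)                       # String, so #encoding works
    blob_service.create_block_blob(container_name, File.basename(path), content)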

pseudocode to fetch a blob through a SAS token

Hi guys, pardon my ineptness, but I'm having issues fetching a simple blob object through a SAS token.
If anyone can point me in the direction of documentation or assist with pseudocode, I would greatly appreciate it. I'm receiving a URL such as https://accountname.blob.core.windows.net/container/file?sv=2016-06-01&sr=b&sig=laskdjflkjsdfkjsadf&se=2017-03-08&sp=r

I can parse out the account name, the container, the file and the SAS key, but then I haven't been able to get them to work with the Ruby SDK.

Thanks in advance!
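A hedged sketch that sidesteps the SDK entirely: the SAS URL already carries its own authorization, so the blob can be fetched with plain Net::HTTP (sas_url stands in for the URL above):

    require "net/http"
    require "uri"

    uri = URI(sas_url)
    response = Net::HTTP.get_response(uri)
    File.binwrite("downloaded_file", response.body) if response.is_a?(Net::HTTPSuccess)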

How do I set up a client using a SAS key

Is there any way I can set up the client using a SAS key instead of storage_access_key, as below?

client = Azure::Storage::Client.create(:storage_account_name => "your account name", :storage_access_key => "your access key")

[P0]: Need your help to support GermanCloud

We need to support GermanCloud in Cloud Foundry, so we need your help to support GermanCloud in azure-storage-ruby. You can reference the endpoints for GermanCloud in this link:

  new Environment({
    name: 'AzureGermanCloud',
    portalUrl: 'http://portal.microsoftazure.de/',
    publishingProfileUrl: 'https://manage.microsoftazure.de/publishsettings/index',
    managementEndpointUrl: 'https://management.core.cloudapi.de',
    resourceManagerEndpointUrl: 'https://management.microsoftazure.de',
    sqlManagementEndpointUrl: 'https://management.core.cloudapi.de:8443/',
    sqlServerHostnameSuffix: '.database.cloudapi.de',
    galleryEndpointUrl: 'https://gallery.cloudapi.de/',
    activeDirectoryEndpointUrl: 'https://login.microsoftonline.de',
    activeDirectoryResourceId: 'https://management.core.cloudapi.de/',
    activeDirectoryGraphResourceId: 'https://graph.cloudapi.de/',
    activeDirectoryGraphApiVersion: '2013-04-05',
    storageEndpointSuffix: '.core.cloudapi.de',
    keyVaultDnsSuffix: '.vault.microsoftazure.de',
    // TODO: add dns suffixes for the US government for datalake store and datalake analytics once they are defined.
    azureDataLakeStoreFileSystemEndpointSuffix: 'N/A',
    azureDataLakeAnalyticsCatalogAndJobEndpointSuffix: 'N/A'
  })
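Until an environment switch exists, a hedged sketch of pointing the client at the GermanCloud endpoint suffix explicitly, assuming the storage_blob_host/storage_table_host/storage_queue_host client options are honored:

    account = "youraccount"
    client = Azure::Storage::Client.create(
      :storage_account_name => account,
      :storage_access_key   => "your access key",
      :storage_blob_host    => "https://#{account}.blob.core.cloudapi.de",
      :storage_table_host   => "https://#{account}.table.core.cloudapi.de",
      :storage_queue_host   => "https://#{account}.queue.core.cloudapi.de"
    )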

TableService.new does not work

Using 0.10.2, I followed the readme and it fails when calling TableService.new.
The storage name and key were set correctly in the environment.

require "azure/storage"
tables = Azure::Storage::Table::TableService.new

The comments suggest that we need to call setup first; why not just include that code?

require "azure/storage"
Azure::Storage.setup
tables = Azure::Storage::Table::TableService.new

0.10.2.preview generates bad blob sas-url

I have been using 0.10.1.preview for some time. I tried to move to 0.10.2.preview and found that the generated blob SAS URL is different: the SAS token is shorter and has a different &sv value (version date). The application that receives the URL fails to open the blob with it.
