azure / azure-storage-ruby
Microsoft Azure Storage Library for Ruby
Home Page: http://azure.github.io/azure-storage-ruby/
Hi guys, pardon my ineptness, but I'm having issues fetching a simple blob object through a SAS token.
If anyone can point me toward documentation or assist with pseudocode, I would greatly appreciate it. I'm receiving a URL such as https://accountname.blob.core.windows.net/container/file?sv=2016-06-01&sr=b&sig=laskdjflkjsdfkjsadf&se=2017-03-08&sp=r
I can parse out the account name, the container, the file, and the SAS key, but I haven't been able to get them working with the Ruby SDK.
Thanks in advance!
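For reference, here is a minimal sketch of splitting such a URL into its pieces with the Ruby standard library; the URL below is a placeholder shaped like the one above, and the :storage_sas_token client option in the commented-out part is an assumption to verify against your installed version:

```ruby
require "uri"

# Placeholder SAS URL shaped like the one in the question.
url = "https://accountname.blob.core.windows.net/container/file?sv=2016-06-01&sr=b&sig=abc&se=2017-03-08&sp=r"
uri = URI.parse(url)

account = uri.host.split(".").first                 # account name
container, blob = uri.path.sub(%r{^/}, "").split("/", 2)
sas_token = uri.query                               # everything after "?"

# With those pieces, recent library versions reportedly accept a SAS token
# directly (option name assumed, check your version):
# client = Azure::Storage::Client.create(
#   storage_account_name: account,
#   storage_sas_token:    sas_token)
# blob_obj, content = client.blob_client.get_blob(container, blob)
```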
When executing client = Azure::Storage.create(:storage_account_name => "###", :storage_access_key => "###")
I receive:
Azure::Storage::InvalidOptionsError: options provided are not valid set: {}
Setup is working...
Thanks
-pankaj
Currently, when the container does not exist, Container.get_container_properties returns properties:
{:last_modified=>nil, :etag=>nil, :lease_status=>nil, :lease_state=>nil, :lease_duration=>nil}.
It should raise an exception, as azure-sdk-for-ruby does.
I tried using the Ruby SDK and redirecting a Linux IO stream to the SDK API.
Example of what I am trying:
cat file | azure_blob_service.create_block_blob("firstcontainer","image-blob", $stdin)
I am getting the exception below:
/opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/core/service.rb:34:in `call': undefined method `encoding' for #<IO:> (NoMethodError)
Did you mean?  set_encoding
from /opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/core/filtered_service.rb:33:in `call'
from /opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/core/signed_service.rb:39:in `call'
from /opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/blob/blob_service.rb:608:in `create_block_blob'
from /Users/prateegu/Desktop/script.rb:10:in
I checked line 34 of the /opt/chefdk/embedded/lib/ruby/gems/2.4.0/gems/stuartpreston-azure-sdk-for-ruby-0.7.2/lib/azure/core/service.rb file, and it looks like there is no distinction between String and IO stream objects. Can you please get this checked and fixed?
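As a stopgap, assuming the call path really does require a String-like body as the error suggests, one can drain the IO into a String before handing it to the SDK. This is a hypothetical workaround sketch, not a library fix:

```ruby
# Read an arbitrary IO (e.g. $stdin from `cat file | ruby script.rb`) into
# a binary String, which responds to #encoding as the SDK expects.
def stream_to_string(io)
  io.binmode if io.respond_to?(:binmode)
  io.read
end

# azure_blob_service.create_block_blob("firstcontainer", "image-blob",
#                                      stream_to_string($stdin))
```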
set_blob_metadata, the blob can be tagged. Is there a way to supply tags as filter and fetch the blob(s) given the container_name
I am copying a vhd from one subscription to the other.
x = copy_blob_from_uri( 'container', 'vhd_name', source_uri)
blob = blob_client.get_blob_properties('container', 'vhd_name')
The timeout I get is:
/Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:879:in `initialize': execution expired (Faraday::ConnectionFailed)
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:879:in `open'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:879:in `block in connect'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/timeout.rb:88:in `block in timeout'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/timeout.rb:98:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/timeout.rb:98:in `timeout'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:878:in `connect'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:863:in `do_start'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:852:in `start'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/2.2.0/net/http.rb:1375:in `request'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:82:in `perform_request'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:40:in `block in call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:87:in `with_net_http_connection'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:32:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday_middleware-0.11.0.1/lib/faraday_middleware/response/follow_redirects.rb:78:in `perform_with_redirection'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday_middleware-0.11.0.1/lib/faraday_middleware/response/follow_redirects.rb:66:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/rack_builder.rb:139:in `build_response'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/faraday-0.9.2/lib/faraday/connection.rb:377:in `run_request'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/http_response_helper.rb:27:in `set_up_response'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:143:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/http/signer_filter.rb:28:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/http/signer_filter.rb:28:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:104:in `block in with_filter'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/service.rb:36:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/filtered_service.rb:34:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-core-0.1.8/lib/azure/core/signed_service.rb:41:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-storage-0.12.1.preview/lib/azure/storage/service/storage_service.rb:53:in `call'
from /Users/john_doe/.rbenv/versions/2.2.3/lib/ruby/gems/2.2.0/gems/azure-storage-0.12.1.preview/lib/azure/storage/blob/blob_service.rb:59:in `call'
I'm having some difficulty figuring out how to set the Content-Disposition in Ruby so that I can set a "friendly download filename". Upon retrieval of anything I upload, I only get the following non-standard header back (sample url: https://dcpreservation.blob.core.windows.net/production-master/bpl-dev_j6731378s ):
x-ms-meta-Content_disposition: attachment; filename=myfile.tif
The furthest I have gone is setting this property in two ways at once:
metadata = { :content_disposition => 'attachment; filename=myfile.tif' }
options = { :metadata => metadata, :content_type => 'image/tif', :content_disposition => 'attachment; filename=myfile.tif' }
blob = azure_blob_service.create_block_blob(container.name, 'bpl-dev_j6731378s', content, options)
Am I doing something incorrect? Is there a separate way to set the Content-Disposition header for direct downloads that may be outside the scope of this Ruby library? Or is the only way to make a file available for download with a custom filename to have my application act as an intermediate layer?
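For what it's worth, create_block_blob documents a :content_disposition option that maps to the blob's real Content-Disposition property, whereas :metadata becomes x-ms-meta-* headers (which matches the observed behavior above). A hedged sketch, with the container/blob names taken from the question:

```ruby
# :metadata turns into x-ms-meta-* headers (the non-standard header seen
# above); :content_disposition should set the actual property instead.
options = {
  content_type:        "image/tiff",
  content_disposition: 'attachment; filename="myfile.tif"'
}

# blob = azure_blob_service.create_block_blob(
#   container.name, "bpl-dev_j6731378s", content, options)
```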
In many cases you get the generic error "one of the request inputs is not valid" from the REST API, but the API also returns detail about which field or input is not valid. I am not seeing that detail in the exception message.
This client is still conflating Content-Encoding with character-set encoding, here:
Valid values for Content-Encoding are, e.g., 'gzip', 'compress', etc.:
Content-Encoding: gzip
The character-set encoding is found at the end of the Content-Type header, e.g.
Content-Type: application/json; charset=utf-8
My colleague @thomas-schreiter filed a PR against the old SDK here:
Here is what I am doing:
Azure::Storage::setup(:storage_account_name => @account_name,
:storage_access_key => @access_key)
@blob_service = Azure::Storage::Blob::BlobService.new
...
blob_factual, content = @blob_service.get_blob(container_name, deconstructed[:blob_name], { :timeout => timeout })
File.open("#{deconstructed[:target_dir]}/#{deconstructed[:blob_name]}", "wb") { |f| f.write(content) }
I hit the following error:
WARN: Unable to download: 00094811-D064-41D1-9F98-5BF3C24F05FC due to argument error.
unknown encoding name - application/octet-stream
/usr/local/var/rbenv/versions/2.2.1/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.0.preview/lib/azure/storage/blob/blob_service.rb:63:in `force_encoding'
/usr/local/var/rbenv/versions/2.2.1/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.0.preview/lib/azure/storage/blob/blob_service.rb:63:in `call'
/usr/local/var/rbenv/versions/2.2.1/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.0.preview/lib/azure/storage/blob/blob.rb:89:in `get_blob'
move_blobs_azure.rb:63:in `download_blob'
move_blobs_azure.rb:49:in `block in get_blobs_from_container'
move_blobs_azure.rb:45:in `each'
move_blobs_azure.rb:45:in `get_blobs_from_container'
move_blobs_azure.rb:138:in `block in each'
move_blobs_azure.rb:135:in `'
It appears to be failing at a call to .force_encoding(...).
When the container does not exist, list_blobs should raise an exception. But in v0.10.2, it returns the exception object instead.
https://github.com/Azure/azure-storage-ruby/blob/master/lib/azure/storage/blob/container.rb#L551
def list_blobs(name, options = {})
  # Call
  response = call(:get, uri)
  # Result
  if response.success?
    Serialization.blob_enumeration_results_from_xml(response.body)
  else
    response.exception
  end
end
The library has a hard dependency on the 1.x JSON gem, which breaks newer gems that are using 2.x JSON. Can the dependency be relaxed or removed from the gemspec?
Do you plan on integrating Azure Storage with fog?
I am not able to get the list method for page blob ranges to work. I am not sure if it's really a bug or if I am using the method incorrectly; apologies if it's the latter.
page_data = 512.times.map { [*'0'..'9', *'a'..'z'].sample }.join
blob_service.put_blob_pages(container_name, blob_name, 0, 511, page_data, {})
pages = blob_service.list_page_blob_ranges(container_name, blob_name, { :start_range => 0, :end_range => 511})
I expected to get at least one page with the data I just pushed, but I am not seeing the data in the first element of the returned array.
I have logic to keep reading blobs. Something like:
blob, content = @azure_blob.get_blob(@container, blob_name, {:start_range=>start_index} )
After the code ran for about 24 hours, I got the following exception:
["org/jruby/ext/openssl/SSLSocket.java:809:in `sysread_nonblock'",
"/usr/share/logstash/vendor/jruby/lib/ruby/shared/jopenssl19/openssl/buffering.rb:174:in `read_nonblock'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/protocol.rb:141:in `rbuf_fill'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/protocol.rb:92:in `read'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:2764:in `read_body_0'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:2719:in `read_body'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1052:in `get'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1331:in `transport_request'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:2680:in `reading_body'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:2679:in `reading_body'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1330:in `transport_request'",
"org/jruby/RubyKernel.java:1242:in `catch'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1325:in `transport_request'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1302:in `request'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1295:in `request'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:746:in `start'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:744:in `start'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1293:in `request'",
"/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:1035:in `get'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:80:in `perform_request'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:40:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:87:in `with_net_http_connection'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:32:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday_middleware-0.11.0.1/lib/faraday_middleware/response/follow_redirects.rb:78:in `perform_with_redirection'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday_middleware-0.11.0.1/lib/faraday_middleware/response/follow_redirects.rb:66:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/rack_builder.rb:139:in `build_response'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/faraday-0.9.2/lib/faraday/connection.rb:377:in `run_request'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/http_response_helper.rb:27:in `set_up_response'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:143:in `call'",
"org/jruby/RubyMethod.java:116:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/retry_policy.rb:41:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:104:in `with_filter'",
"org/jruby/RubyMethod.java:116:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/signer_filter.rb:28:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/http/http_request.rb:104:in `with_filter'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/service.rb:36:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/filtered_service.rb:34:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-core-0.1.8/lib/azure/core/signed_service.rb:41:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-storage-0.11.4.preview/lib/azure/storage/service/storage_service.rb:52:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-storage-0.11.4.preview/lib/azure/storage/blob/blob_service.rb:59:in `call'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/azure-storage-0.11.4.preview/lib/azure/storage/blob/blob.rb:91:in `get_blob'",
"/usr/share/logstash/vendor/local_gems/950703aa/logstash-input-azureblob-0.9.8/lib/logstash/inputs/azureblob.rb:100:in `process'",
"/usr/share/logstash/vendor/local_gems/950703aa/logstash-input-azureblob-0.9.8/lib/logstash/inputs/azureblob.rb:95:in `process'",
"/usr/share/logstash/vendor/local_gems/950703aa/logstash-input-azureblob-0.9.8/lib/logstash/inputs/azureblob.rb:76:in `run'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:456:in `inputworker'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:449:in `start_input'"] {:exception=>#<IOError: Connection reset by peer>}
[2017-07-23T23:57:12,818][ERROR][logstash.inputs.logstashinputazureblob] Oh My, An error occurred.
Is this an SDK bug? Or is there anything I should do as the caller, such as flushing a buffer? Please let me know if more info is needed.
Migrate the issue from the Azure SDK: Azure/azure-sdk-for-ruby#281
Inserting a record after querying records from a table always fails.
The root cause is that the second request reuses the Faraday connection built for the first URI.
ARM returns the error below: One of the request inputs is not valid.
The cache key is not correct in http_client.rb:
def agents(uri)
  uri = URI.parse(uri) if uri.is_a?(String)
  key = uri.scheme.to_s + uri.host.to_s + uri.port.to_s
  @agents ||= {}
  unless @agents.key?(key)
    @agents[key] = build_http(uri)
  end
  @agents[key]
end
copy_blob_from_uri should raise the response as an exception if the response contains neither 'x-ms-copy-id' nor 'x-ms-copy-status'.
One of our customers had copy_blob_from_uri fail but could not get any useful information, because the caller got ['', ''] as the result.
def copy_blob_from_uri(destination_container, destination_blob, source_blob_uri, options = {})
  ...
  response = call(:put, uri, nil, headers)
  return response.headers['x-ms-copy-id'], response.headers['x-ms-copy-status']
end
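A sketch of the check being requested, using a plain hash in place of the real response object (the helper name is hypothetical): surface the raw response when the copy headers are absent, instead of silently handing ['', ''] back to the caller.

```ruby
# Hypothetical helper: raise on a useless copy response instead of
# returning empty strings.
def copy_result(headers)
  id     = headers["x-ms-copy-id"].to_s
  status = headers["x-ms-copy-status"].to_s
  if id.empty? || status.empty?
    raise "copy_blob_from_uri returned no copy id/status: #{headers.inspect}"
  end
  [id, status]
end

copy_result("x-ms-copy-id" => "1f0b", "x-ms-copy-status" => "pending")
# → ["1f0b", "pending"]
```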
Hi there.
I'm trying to figure out how I would use the Ruby SDK to upload a new object to a container.
I tried the create_from_connection_string method, but it fails later when using the blob_client.
config = %{BlobEndpoint=https://#{someaccount}.blob.core.windows.net;SharedAccessSignature=sv=2015-12-11&ss=b&srt=o&sp=rw&se=2020-02-18T01:13:23Z&st=2017-02-17T17:13:23Z&spr=https&sig=1A5sp62Cx2ATRgNTzMVY2H73wjDdYdWxHXOPVw%2BMfEs%3D}
client = Azure::Storage::Client.create_from_connection_string(config)
blogs = client.blob_client
which crashes with an exception:
/Users/magegu/.rvm/gems/ruby-2.3.1/gems/azure-core-0.1.7/lib/azure/core/auth/signer.rb:32:in `initialize': Signing key must be provided (ArgumentError)
from /Users/magegu/.rvm/gems/ruby-2.3.1/gems/azure-core-0.1.7/lib/azure/core/auth/shared_key.rb:33:in `initialize'
from /Users/magegu/.rvm/gems/ruby-2.3.1/gems/azure-storage-0.11.5.preview/lib/azure/storage/blob/blob_service.rb:43:in `new'
Any suggestions on how to solve this with the current SDK/API?
Thanks
When attempting to send a large (50MB+) file, things don't quite work :(
/home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/http/http_request.rb:147:in `call': RequestBodyTooLarge (413): The request body is too large and exceeds the maximum permissible limit. (Azure::Core::Http::HTTPError)
RequestId:7c1c2e38-0001-00ad-36b4-38a130000000
Time:2016-11-07T05:06:18.1887668Z
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/http/signer_filter.rb:28:in `call'
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/http/signer_filter.rb:28:in `call'
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/http/http_request.rb:104:in `block in with_filter'
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/service.rb:36:in `call'
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/filtered_service.rb:34:in `call'
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-core-0.1.5/lib/azure/core/signed_service.rb:41:in `call'
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/service/storage_service.rb:52:in `call'
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/blob_service.rb:59:in `call'
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/block.rb:114:in `create_block_blob'
from blob-sync:50:in `block (2 levels) in <main>'
from /home/tyler/.rvm/rubies/ruby-2.2.0/lib/ruby/2.2.0/set.rb:283:in `each_key'
from /home/tyler/.rvm/rubies/ruby-2.2.0/lib/ruby/2.2.0/set.rb:283:in `each'
from blob-sync:45:in `block in <main>'
from blob-sync:17:in `each'
from blob-sync:17:in `<main>'
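One workaround for the size limit is to upload the file in blocks and commit the block list yourself. This is a hedged sketch: it follows the put_blob_block / commit_blob_blocks method names in this library's blob service, but the exact signatures and per-block size limit should be checked against your installed version; container and file names are placeholders.

```ruby
require "base64"

CHUNK = 4 * 1024 * 1024 # 4 MiB per block (assumed safe block size)

# Base64 block ids, all the same length, one per chunk of the payload.
def block_ids_for(size, chunk = CHUNK)
  (0...(size.to_f / chunk).ceil).map { |i| Base64.strict_encode64(format("%06d", i)) }
end

# Sketch of the upload loop (service calls commented out):
# File.open("big.bin", "rb") do |f|
#   ids = block_ids_for(f.size)
#   ids.each { |id| blob_service.put_blob_block("container", "big.bin", id, f.read(CHUNK)) }
#   blob_service.commit_blob_blocks("container", "big.bin", ids.map { |id| [id, :uncommitted] })
# end
```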
Please update dependencies to enable nokogiri security fix.
https://groups.google.com/forum/#!topic/ruby-security-ann/VZtBohEQ2iI
Hi,
Rails version : 4.2
Created a new app, included the following line in my (blank) GemFile
gem 'azure-storage', '~> 0.10.0.preview'
Then went into rails console and tried the following:
Azure::Storage::Core::Auth::SharedAccessSignature.new('blah','blah_again')
=> NameError: uninitialized constant Azure::Storage::Core::Auth::SharedAccessSignature
I think you're using 'autoload' heavily. Any ideas on how to get around this issue?
Best Regards,
RR
I wrote a script to copy all of the files from my S3 to Azure Storage. The upload completes, but for all of those files the content type is stored as text/plain (while the file is actually a PDF). How do I set the content type for a file while uploading?
And is there any way to bulk-edit the content-type property for all of the files?
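A hedged sketch of both halves, assuming the :content_type option on create_block_blob and the set_blob_properties method (names per this library's blob service; the container and file names are placeholders):

```ruby
# On upload: pass :content_type so the blob is not stored as text/plain.
upload_options = { content_type: "application/pdf" }
# content = File.open("report.pdf", "rb", &:read)
# azure_blob_service.create_block_blob("documents", "report.pdf", content, upload_options)

# After the fact: rewrite the property on each existing blob (one call
# per blob; there is no single bulk operation that I know of).
# azure_blob_service.list_blobs("documents").each do |blob|
#   azure_blob_service.set_blob_properties("documents", blob.name,
#                                          content_type: "application/pdf")
# end
```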
We need to support GermanCloud in cloud foundry so we need your help to support GermanCloud in azure-storage-ruby. You can reference the endpoints for GermanCloud in this link
new Environment({
name: 'AzureGermanCloud',
portalUrl: 'http://portal.microsoftazure.de/',
publishingProfileUrl: 'https://manage.microsoftazure.de/publishsettings/index',
managementEndpointUrl: 'https://management.core.cloudapi.de',
resourceManagerEndpointUrl: 'https://management.microsoftazure.de',
sqlManagementEndpointUrl: 'https://management.core.cloudapi.de:8443/',
sqlServerHostnameSuffix: '.database.cloudapi.de',
galleryEndpointUrl: 'https://gallery.cloudapi.de/',
activeDirectoryEndpointUrl: 'https://login.microsoftonline.de',
activeDirectoryResourceId: 'https://management.core.cloudapi.de/',
activeDirectoryGraphResourceId: 'https://graph.cloudapi.de/',
activeDirectoryGraphApiVersion: '2013-04-05',
storageEndpointSuffix: '.core.cloudapi.de',
keyVaultDnsSuffix: '.vault.microsoftazure.de',
// TODO: add dns suffixes for the US government for datalake store and datalake analytics once they are defined.
azureDataLakeStoreFileSystemEndpointSuffix: 'N/A',
azureDataLakeAnalyticsCatalogAndJobEndpointSuffix: 'N/A'
})
We are pleased to announce an update to our Ruby client library for Storage.
We received storage service feature support requests and requests to fix a few issues in the Azure SDK for Ruby. To address these, we are working towards supporting new features released on the service and bringing the Ruby Storage library to parity with the newest endpoint. Some of these changes include AppendBlob support, File service support, replacing AtomPub with JSON in the Table service, Account SAS, improved response parsing, and many other small features and API updates.
To support the above, we will need to make some breaking changes. The first round of changes will be the largest and will bring the library up to the Feb 2015 service version. We will focus on the storage features now present in this repository and release as the azure-storage package on RubyGems. All of this work is being done in the dev branch and we will be maintaining ChangeLog and BreakingChanges files so you can see a summary of what’s happening. Note that this branch should not be considered stable!
After bringing the library up to the latest version, we will then begin adding client-side improvements. We are considering items such as built-in MD5 calculation, automatically following continuation tokens on list APIs, improved queue message encoding support, and many others, including improvements the community suggests. So please provide us your feedback on what you would like to see implemented in the Ruby Storage client library.
We’d love to hear from you!
Thanks,
The Azure Storage Team
I randomly get an error when inserting a record into a storage account table. Sometimes it can be reproduced, sometimes not.
Error log:
Failed creating missing vms > etcd_z1/0 (e4f6733c-7c66-4f3a-8781-1e716ac3e117): insert_entity: #<RuntimeError: Xml is not a entry node.>
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/azure-storage-0.10.2.preview/lib/azure/storage/service/serialization.rb:297:in `expect_node'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/azure-storage-0.10.2.preview/lib/azure/storage/table/serialization.rb:102:in `hash_from_entry_xml'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/azure-storage-0.10.2.preview/lib/azure/storage/table/table_service.rb:220:in `insert_entity'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/table_manager.rb:54:in `insert_entity'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/stemcell_manager.rb:114:in `handle_stemcell_in_different_storage_account'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/stemcell_manager.rb:70:in `has_stemcell?'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/cloud.rb:107:in `block in create_vm'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/bosh_common-1.3262.4.0/lib/common/thread_formatter.rb:49:in `with_thread_name'
/var/vcap/packages/bosh_azure_cpi/lib/cloud/azure/cloud.rb:104:in `create_vm'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/bosh_cpi-1.3262.4.0/lib/bosh/cpi/cli.rb:70:in `public_send'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.2.0/gems/bosh_cpi-1.3262.4.0/lib/bosh/cpi/cli.rb:70:in `run'
/var/vcap/packages/bosh_azure_cpi/bin/azure_cpi:35:in `<main>' (00:00:25)
azure-storage-ruby 0.11.4 does not support setting the 'User-Agent' header; it always uses the default 'User-Agent'.
azure-storage-ruby/lib/azure/storage/service/storage_service.rb
Lines 157 to 164 in 61c7227
def common_headers(options = {})
  headers = {
    'x-ms-version' => Azure::Storage::Default::STG_VERSION,
    'User-Agent' => Azure::Storage::Default::USER_AGENT
  }
  headers.merge!({ 'x-ms-client-request-id' => options[:request_id] }) if options[:request_id]
  headers
end
Is there a way to create and attach managed disks to a VM?
Migrate from the Azure SDK: Azure/azure-sdk-for-ruby#189
From @schadr:
I am trying to use leases on blobs to ensure that only one host can modify the content.
For that purpose I create a lease on a blob using the acquire_lease operation,
but there is no way to provide the lease_id to calls such as create_blob_pages, so the only way to use leases is to ensure a blob does not get modified while reading from it.
Or am I missing something?
Currently azure-storage uses the mime-types (~> 2.0) gem.
If we downgrade mime-types to (= 1.25.1), will there be any breaking changes?
I am not able to find much documentation on the matter and I am interested in getting the continuation token for larger queries to paginate. Is it available somehow? If so, how?
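The table query results do expose a continuation token in recent versions (result.continuation_token, fed back in via the :continuation_token option; verify both names against your installed version). The pagination loop itself can be sketched independently of the service:

```ruby
# Generic drain loop: call `fetch` with the previous page's token until
# the token comes back nil, accumulating every page's items.
def drain(fetch)
  all, token = [], nil
  loop do
    page = fetch.call(token)
    all.concat(page[:items])
    token = page[:token]
    break if token.nil?
  end
  all
end

# With the table service (method shapes assumed, not verified here):
# entities = drain(->(t) {
#   r = table_service.query_entities("mytable", continuation_token: t)
#   { items: r.to_a, token: r.continuation_token }
# })

# Self-contained demo with stubbed pages:
pages = { nil => { items: [1, 2], token: :next }, :next => { items: [3], token: nil } }
drain(->(t) { pages[t] }) # → [1, 2, 3]
```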
using azure lib 0.7.6
/Users/micahsmith/.rbenv/versions/2.2.2/lib/ruby/gems/2.2.0/gems/azure-core-0.1.5/lib/azure/core/auth/signer.rb:32:in `initialize': Signing key must be provided (ArgumentError)
from /Users/micahsmith/.rbenv/versions/2.2.2/lib/ruby/gems/2.2.0/gems/azure-core-0.1.5/lib/azure/core/auth/shared_key.rb:33:in `initialize'
from /Users/micahsmith/.rbenv/versions/2.2.2/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/blob_service.rb:43:in `new'
from /Users/micahsmith/.rbenv/versions/2.2.2/lib/ruby/gems/2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/blob_service.rb:43:in `initialize'
from upload.rb:12:in `new'
from upload.rb:12:in `<main>'
and the following code:
client = Azure::Storage::Client.create(:storage_account_name => accountName, :storage_access_key => storageKey)
## throws here
@blobs = Azure::Storage::Blob::BlobService.new
nokogiri has recently been updated to address several CVEs in the libxml2 library: link.
azure-storage is locked to versions of nokogiri that conform to ~> 1.6.0, which prevents us from updating our nokogiri to the patched version.
Is there anything preventing azure-storage from using newer versions of nokogiri? We would like to update our azure-storage to a version that supports nokogiri 1.7.1.
Using 0.10.2, I followed the readme and failed when calling TableService.new.
The storage name and key were set correctly in the environment.
require "azure/storage"
tables = Azure::Storage::Table::TableService.new
The comments suggest that we need to call setup, so why not just include that code?
require "azure/storage"
Azure::Storage.setup
tables = Azure::Storage::Table::TableService.new
Mixed camelCase naming in lib/azure/storage/client.rb:
xxxClient => xxx_client
def blobClient(options = {})
  @blobClient ||= Azure::Storage::Blob::BlobService.new(default_client(options))
end

# Azure Queue service client configured from this Azure Storage client instance
# @return [Azure::Storage::Queue::QueueService]
def queueClient(options = {})
  @queueClient ||= Azure::Storage::Queue::QueueService.new(default_client(options))
end

# Azure Table service client configured from this Azure Storage client instance
# @return [Azure::Storage::Table::TableService]
def tableClient(options = {})
  @tableClient ||= Azure::Storage::Table::TableService.new(default_client(options))
end
It would be better to add the snippet below to the README to show how to obtain the blob/table/queue clients after creating the default client:
require "azure/storage"
# Setup a specific instance of an Azure::Storage::Client
client = Azure::Storage::Client.create(:storage_account_name => "your account name", :storage_access_key => "your access key")
blobs = client.blob_client
queues = client.queue_client
tables = client.table_client
Hi Everyone,
This is your chance to shape the Azure Storage product roadmap! We are looking for feedback from our users regarding Azure Storage Client Library and Tools. If you'd like to help us, please take the following short survey: http://aka.ms/ClientLibrariesFeedback2016
Sercan Guler
Program Manager, Azure Storage
Migrate the issue from the Azure SDK repo: Azure/azure-sdk-for-ruby#270
v0.7.0: blob_service::get_blob_properties returns a blob instead of blob_properties
https://github.com/Azure/azure-sdk-for-ruby/blob/v0.7.0/lib/azure/blob/blob_service.rb#L777-778
The call to #encoding only exists on String-like objects:
/home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/blob_service.rb:50:in `call': undefined method `encoding' for #<File:/srv/releases/jenkins/debian/jenkins_2.29_all.deb> (NoMethodError)
from /home/tyler/.rvm/gems/ruby-2.2.0/gems/azure-storage-0.11.3.preview/lib/azure/storage/blob/block.rb:114:in `create_block_blob'
from blob-sync:44:in `block (2 levels) in <main>'
from /home/tyler/.rvm/rubies/ruby-2.2.0/lib/ruby/2.2.0/set.rb:283:in `each_key'
from /home/tyler/.rvm/rubies/ruby-2.2.0/lib/ruby/2.2.0/set.rb:283:in `each'
from blob-sync:39:in `block in <main>'
from blob-sync:11:in `each'
from blob-sync:11:in `<main>'
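Pending a fix, a workaround consistent with the error above is to read the file into a binary String before calling create_block_blob (a sketch, not the library's recommended path; the path below is from the traceback):

```ruby
# Binary-read the file; a String responds to #encoding, a File does not.
def blob_body(path)
  File.open(path, "rb", &:read)
end

# content = blob_body("/srv/releases/jenkins/debian/jenkins_2.29_all.deb")
# blob_service.create_block_blob(container, name, content)
```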
According to the doc, the Azure storage service only checks the hash that arrived against the one that was sent when Content-MD5 is specified.
But when I called create_block_blob with a header containing an invalid MD5 hash in x-ms-blob-content-md5 but no Content-MD5, Azure returned the error below:
<Code>Md5Mismatch</Code><Message>The MD5 value specified in the request did not match with the MD5 value calculated by the server. </Message>
I have been using 0.10.1.preview for some time. I tried to move to 0.10.2.preview and found that the generated blob SAS URL is different: the SAS token is shorter and has a different sv value (version date). The application consuming the URL fails to open the blob with it.
When list_blobs is called with a delimiter, e.g. /, it does not return any results despite the server returning a valid response.
This part in blob_enumeration_results_from_xml doesn't check for (xml > "Blobs") > "BlobPrefix", which is what the server returns when delimiter is specified.
If this is actually an issue, it's very surprising, since this use case of list_blobs is extremely common and even the default in other client libraries. It's needed when trying to traverse a pseudo-directory structure in blob storage; otherwise, results are returned "flat" or "recursive", and with thousands of blobs that's not usable.
See the .NET client, which uses a delimiter (/) by default (useFlatBlobListing is false by default).
I would submit a pull request (after signing the CLA) but don't know how you want the results returned, given that they don't represent actual blobs, just prefixes.
According to the comments on the method apply_retry_policy, a subclass of RetryPolicyFilter should override apply_retry_policy. If the subclass's apply_retry_policy decides to retry by setting retry_data[:retryable] = true, the method should_retry? of RetryPolicyFilter will override this value on line 46, so the retry never happens.
Suggest changing
def should_retry?(response, retry_data)
  apply_retry_policy retry_data
  retry_data[:retryable] = retry_data[:count] <= @retry_count
  return false unless retry_data[:retryable]
  should_retry_on_local_error? retry_data
  should_retry_on_error? response, retry_data
  return false unless retry_data[:retryable]
  adjust_retry_parameter retry_data
end
to:
def should_retry?(response, retry_data)
  retry_data[:retryable] = false
  apply_retry_policy retry_data
  return false if retry_data[:count] > @retry_count
  retryable = retry_data[:retryable] || should_retry_on_local_error?(retry_data) || should_retry_on_error?(response, retry_data)
  return false unless retryable
  adjust_retry_parameter retry_data
end
Hey!
I'm using Azure for direct upload from a Rails app. My controller generates a signed URL using Azure::Storage::Core::Auth::SharedAccessSignature#signed_uri and returns it as a JSON response. From my JS I then upload the file to that signed URL.
Problem: my client-side upload works only if I specify 'x-ms-blob-type': 'BlockBlob' in the header. Can I specify this value on the signer so it is appended to the query string, instead of manually adding the header on the JS side?
My signer code:
signer = Azure::Storage::Core::Auth::SharedAccessSignature.new(storage_account_name, storage_access_key)
signer.signed_uri(URI(base_url), false, permissions: "rw", service: "b", resource: "b", expiry: expires_in).to_s
Is something like this possible?
signer.signed_uri(URI(base_url), false, permissions: "rw", service: "b", resource: "b", blob_type: "BlockBlob", expiry: expires_in).to_s
Is there any way I can set up the client using a SAS key instead of storage_access_key, as below?
client = Azure::Storage::Client.create(:storage_account_name => "your account name", :storage_access_key => "your access key")
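This also relates to the SAS-URL question at the top of the page. Splitting a full SAS blob URL into the pieces a client needs is plain stdlib work; the client-construction step below is only a hedged sketch (storage_sas_token is an assumed option name on later azure-storage versions; verify against your installed gem):

```ruby
require "uri"

# Break a full SAS blob URL into account / container / blob / token parts.
# (The sig value in the example is a made-up placeholder.)
def parse_sas_url(url)
  uri = URI(url)
  account = uri.host.split(".").first              # "accountname"
  container, blob = uri.path.sub(%r{^/}, "").split("/", 2)
  { account: account, container: container, blob: blob, sas_token: uri.query }
end

parts = parse_sas_url(
  "https://accountname.blob.core.windows.net/container/file?sv=2016-06-01&sr=b&sig=abc&se=2017-03-08&sp=r"
)
puts parts[:account]    # "accountname"
puts parts[:container]  # "container"

# With the azure-storage gem, a SAS-based client would then look roughly
# like this (assumed API shape, not verified here):
#
#   client = Azure::Storage::Client.create(
#     storage_account_name: parts[:account],
#     storage_sas_token:    parts[:sas_token]
#   )
#   client.blob_client.get_blob(parts[:container], parts[:blob])
```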
Will the Ruby SDK have the ability to create a blob from a stream, e.g. blob_service.create_blob_from_stream(container, blob, stream)? This is present in the Python SDK; its reference can be checked at Create Blob Using Stream.
I couldn't find a similar API in the Ruby SDK (project link: Git link).
Could you please let me know if the functionality already exists in the Ruby SDK and, if yes, how we can use it? Otherwise, what is the timeline for this feature in the Ruby SDK?
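In the absence of create_blob_from_stream, the usual workaround is to read the stream in chunks and upload each as a block. The chunking half is pure stdlib and shown runnable below; the commented upload half assumes put_blob_block / commit_blob_blocks are the gem's block-level calls (verify against your gem version):

```ruby
require "base64"
require "stringio"

# Split an IO stream into fixed-size chunks, yielding [block_id, data] pairs.
# Block IDs must be Base64-encoded and the same length within one blob.
def each_block(io, chunk_size = 4 * 1024 * 1024)
  return enum_for(:each_block, io, chunk_size) unless block_given?
  index = 0
  while (data = io.read(chunk_size))
    yield [Base64.strict_encode64(format("block-%06d", index)), data]
    index += 1
  end
end

# With the azure-storage gem this would drive the block upload loop
# (assumed API shape, not verified here):
#
#   ids = each_block(stream).map do |block_id, data|
#     blob_service.put_blob_block(container, blob, block_id, data)
#     block_id
#   end
#   blob_service.commit_blob_blocks(container, blob,
#                                   ids.map { |id| [id, :uncommitted] })

sizes = each_block(StringIO.new("a" * 10), 4).map { |_id, data| data.size }
puts sizes.inspect  # [4, 4, 2]
```

Because the chunks are read lazily from the IO, this pattern never holds more than one chunk in memory, which is the point of a stream API.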
Could you please confirm the bug in azure-storage
which is described in the above issue?
Migrate from the Azure SDK: Azure/azure-sdk-for-ruby#212.
As reported here: http://stackoverflow.com/questions/23567724/ruby-azure-blob-storage-requestbodytoolarge, the SDK doesn't split files bigger than 64 MB for API-supported upload.
Currently, solutions are available using different gems:
•Ruby: https://github.com/dmichael/azure-contrib
•UI: https://github.com/Sology/azure_direct_upload
This should be a part of the SDK, imho.
I am trying to list all blobs in a storage account, but I am only getting a few out of hundreds using list_blobs:
blobs = blob_service.list_blobs(container_name)
def blockblob_operations(blob_service)
  containers = blob_service.list_containers()
  containers.each do |container|
    puts "Container Name: #{container.name}"
  end

  # List all the blobs in the container
  puts 'List Blobs in Container'
  container_name = 'name'
  blobs = blob_service.list_blobs(container_name)
  blobs.each do |blob|
    puts "#{blob.name}"
  end
end
This only shows 6 blobs in the container, but there are more than that. I tried using other options too, but got the same output.
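A likely cause is pagination: the listing calls return at most a server-defined number of entries per request plus a continuation token, and the caller must loop, passing the token back as the :marker option. The loop below is runnable here against a stand-in service; the continuation_token / :marker shape is an assumption about the gem's API, so verify it against your version:

```ruby
# A page of results: an Array plus a token pointing at the next page.
class FakePage < Array
  attr_accessor :continuation_token
end

# Stands in for blob_service so the pagination loop can run locally.
class FakeService
  def initialize(names, page_size)
    @names = names
    @page_size = page_size
  end

  def list_blobs(_container, options = {})
    start = options[:marker] || 0
    page = FakePage.new
    page.concat(@names[start, @page_size] || [])
    next_marker = start + @page_size
    page.continuation_token = next_marker < @names.size ? next_marker : nil
    page
  end
end

# Drain every page of a container listing by following the token.
def all_blobs(service, container)
  results = []
  marker = nil
  loop do
    page = service.list_blobs(container, marker: marker)
    results.concat(page.to_a)
    marker = page.continuation_token
    break if marker.nil?
  end
  results
end

names = (1..13).map { |i| "blob-#{i}" }
puts all_blobs(FakeService.new(names, 5), "mycontainer").size  # 13
```

If only 6 blobs come back, check whether the first page's continuation token is non-nil; stopping after one call is the classic symptom.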
The link in this line should be https://msdn.microsoft.com/en-us/library/azure/dd179352.aspx
blob_client.get_blob_properties(container_name, name, options)
When I tried the above command for a non-existent blob, it took a long time to get the result. But if I comment out the code below, I get the result quickly. It seems the SDK retries when receiving a 404:
blob_client.with_filter(Azure::Storage::Core::Filter::ExponentialRetryPolicyFilter.new)
After issue #334 was fixed, it no longer hangs, but a connection timeout is thrown. Could you add a retry for the timeout issue, with the max retry count set to 10? Thanks.
Logs:
":"Rescued Unknown: Connection timed out - connect(2) for "13.68.167.248" port 443. backtrace: /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/resolv-replace.rb:23:in initialize' /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/resolv-replace.rb:23:ininitialize'
/var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:879:in open' /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:879:inblock in connect'
/var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/timeout.rb:76:in timeout' /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:878:inconnect'
/var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:863:in do_start' /var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:852:instart'
/var/vcap/packages/ruby_azure_cpi/lib/ruby/2.1.0/net/http.rb:1369:in request' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:82:inperform_request'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:40:in block in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:87:inwith_net_http_connection'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:32:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday_middleware-0.10.0/lib/faraday_middleware/response/follow_redirects.rb:76:inperform_with_redirection'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday_middleware-0.10.0/lib/faraday_middleware/response/follow_redirects.rb:64:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/rack_builder.rb:139:inbuild_response'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/faraday-0.9.2/lib/faraday/connection.rb:377:in run_request' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/http/http_request.rb:145:incall'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/http/signer_filter.rb:28:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/http/signer_filter.rb:28:incall'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/http/http_request.rb:99:in block in with_filter' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/service.rb:36:incall'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/filtered_service.rb:34:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-core-0.1.1/lib/azure/core/signed_service.rb:41:incall'
/var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-0.7.4/lib/azure/blob/blob_service.rb:1385:in call' /var/vcap/packages/bosh_azure_cpi/vendor/bundle/ruby/2.1.0/gems/azure-0.7.4/lib/azure/blob/blob_service.rb:785:inget_blob_properties'