tombuildsstuff / giovanni
An alternative Azure Storage SDK for Go
License: Apache License 2.0
There is a new public preview feature for storage accounts: https://techcommunity.microsoft.com/t5/azure-storage-blog/public-preview-create-additional-5000-azure-storage-accounts/ba-p/3465466. It introduces a new property, dnsEndpointType: when it is set to AzureDnsZone, the endpoints of the sub-services change from the form myAccountname.[service type].[url] to myAccountname.[dnszone].[service type].[url].
This needs a change in this SDK, mostly around internal/endpoints/endpoints.go.
Related to hashicorp/terraform-provider-azurerm#17513
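A minimal sketch of how the endpoint construction might need to branch; the function and parameter names here are illustrative, not the SDK's actual API:

```go
package main

import "fmt"

// buildEndpoint is a hypothetical helper showing the two endpoint shapes.
// When dnsZone is empty the classic form is used; when the account has
// dnsEndpointType set to AzureDnsZone, the zone segment is inserted.
func buildEndpoint(accountName, dnsZone, serviceType, domainSuffix string) string {
	if dnsZone == "" {
		// e.g. myaccount.blob.core.windows.net
		return fmt.Sprintf("%s.%s.%s", accountName, serviceType, domainSuffix)
	}
	// e.g. myaccount.z1.blob.core.windows.net
	return fmt.Sprintf("%s.%s.%s.%s", accountName, dnsZone, serviceType, domainSuffix)
}

func main() {
	fmt.Println(buildEndpoint("myaccount", "", "blob", "core.windows.net"))
	fmt.Println(buildEndpoint("myaccount", "z1", "blob", "core.windows.net"))
}
```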
Currently the entire blob is uploaded in a single HTTP request - it's possible to optimise this using the Block List API.
Terraform previously achieved this using the Storage SDK in the Azure SDK for Go - however it's worth noting that approach uploaded the entire contents of the file multiple times; so it has an implementation bug, but it shows the principle.
In either case it'd be good to add a helper which handles this, so that users can configure the required parallelism for block uploads.
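A hedged sketch of what such a helper could look like; `putBlock` below stands in for the real Put Block call, and the names and signatures are illustrative only:

```go
package main

import (
	"fmt"
	"sync"
)

// splitIntoChunks computes the [start, end) byte ranges for a payload.
func splitIntoChunks(size, chunkSize int64) [][2]int64 {
	var ranges [][2]int64
	for offset := int64(0); offset < size; offset += chunkSize {
		end := offset + chunkSize
		if end > size {
			end = size
		}
		ranges = append(ranges, [2]int64{offset, end})
	}
	return ranges
}

// uploadInParallel fans the chunks out across at most `parallelism` workers.
func uploadInParallel(data []byte, chunkSize int64, parallelism int, putBlock func(offset int64, chunk []byte) error) error {
	ranges := splitIntoChunks(int64(len(data)), chunkSize)
	sem := make(chan struct{}, parallelism)
	errs := make(chan error, len(ranges))
	var wg sync.WaitGroup
	for _, r := range ranges {
		wg.Add(1)
		go func(r [2]int64) {
			defer wg.Done()
			sem <- struct{}{} // limit concurrency
			defer func() { <-sem }()
			errs <- putBlock(r[0], data[r[0]:r[1]])
		}(r)
	}
	wg.Wait()
	close(errs)
	for err := range errs {
		if err != nil {
			return err
		}
	}
	return nil
}

func main() {
	data := make([]byte, 10)
	var mu sync.Mutex
	uploaded := 0
	_ = uploadInParallel(data, 4, 2, func(offset int64, chunk []byte) error {
		mu.Lock()
		uploaded += len(chunk)
		mu.Unlock()
		return nil
	})
	fmt.Println(uploaded) // 10
}
```

A real implementation would finish by committing the block IDs with a Put Block List call.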
fileInfo.Size() can be 0, so line 25 will cause 0/0, which is:
"The result of a floating-point or complex division by zero is not specified beyond the IEEE-754 standard; whether a run-time panic occurs is implementation-specific."
Ref: https://go.dev/ref/spec#Floating_point_operators
The division did not cause a panic, but the value of chunks is Inf, which then panics with "makechan: size out of range" at line 35.
giovanni/storage/2020-08-04/file/files/range_put_file.go
Lines 20 to 35 in 5e72bd2
Related Issue: hashicorp/terraform-provider-azurerm#24171
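A hedged sketch of the kind of guard that would avoid the panic; the function name and signature are illustrative, not the SDK's actual code:

```go
package main

import (
	"fmt"
	"math"
)

// chunkCount is a hypothetical replacement for a bare division: it guards
// against a zero chunk size (division by zero yields an unusable Inf/NaN
// float, which then panics when used to size a channel) and short-circuits
// for an empty file, which needs no chunks at all.
func chunkCount(fileSize, chunkSize int64) (int, error) {
	if fileSize == 0 {
		return 0, nil
	}
	if chunkSize <= 0 {
		return 0, fmt.Errorf("chunk size must be positive, got %d", chunkSize)
	}
	return int(math.Ceil(float64(fileSize) / float64(chunkSize))), nil
}

func main() {
	n, _ := chunkCount(0, 0)
	fmt.Println(n) // 0 - an empty file no longer reaches the division
	n, _ = chunkCount(10, 4)
	fmt.Println(n) // 3
}
```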
entitiesClient := entities.New()
entitiesClient.Client.Authorizer = storageAuth

// insert an entity identified by its PartitionKey/RowKey pair
input := entities.InsertEntityInput{
	PartitionKey:  "abcabc1",
	RowKey:        "123acb1",
	MetaDataLevel: entities.NoMetaData,
	Entity: map[string]interface{}{
		"company":    "Microsoft",
		"department": "ADG",
	},
}

// query the same entity back using the same keys
query := entities.QueryEntitiesInput{
	PartitionKey:  "abcabc1",
	RowKey:        "123acb1",
	MetaDataLevel: entities.NoMetaData,
}

if _, err := entitiesClient.Insert(ctx, accountName, tableName, input); err != nil {
	fmt.Printf("Error creating Entity: %s", err)
}

qresult, err := entitiesClient.Query(ctx, accountName, tableName, query)
if err != nil {
	fmt.Printf("Error querying Entity: %s", err)
}

fmt.Printf("%v\n %v\n", *(qresult.Response.Response), qresult.Entities)
return
The result is:
{200 OK 200 HTTP/1.1 1 1 map[Cache-Control:[no-cache] Content-Type:[application/json;odata=nometadata;streaming=true;charset=utf-8] Date:[Wed, 18 Aug 2021 07:43:05 GMT] Etag:[W/"datetime'2021-08-18T07%3A43%3A04.7865159Z'"] Server:[Windows-Azure-Table/1.0 Microsoft-HTTPAPI/2.0] X-Content-Type-Options:[nosniff] X-Ms-Request-Id:[471b80b6-b002-0002-5b04-947141000000] X-Ms-Version:[2020-08-04]] 0xc0000743c0 -1 [chunked] false false map[] 0xc000492100 0xc0000b6370}
[]
Since it's generally helpful to import all of the SDK Clients for a given API Version, we should look to introduce a Meta Client in the same way that hashicorp/go-azure-sdk does (example).
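A minimal sketch of what such a Meta Client could look like; the sub-client types here are placeholders standing in for the real per-service clients (blobs, containers, and so on):

```go
package main

import "fmt"

// Placeholder sub-clients standing in for e.g. blobs.Client, containers.Client.
type BlobsClient struct{ endpoint string }
type ContainersClient struct{ endpoint string }

// Client is a hypothetical Meta Client bundling every client for one API version.
type Client struct {
	Blobs      BlobsClient
	Containers ContainersClient
}

// NewClient wires up all sub-clients against the same account endpoint, so
// callers configure the endpoint (and, in reality, authorization) once.
func NewClient(endpoint string) Client {
	return Client{
		Blobs:      BlobsClient{endpoint: endpoint},
		Containers: ContainersClient{endpoint: endpoint},
	}
}

func main() {
	c := NewClient("https://myaccount.blob.core.windows.net")
	fmt.Println(c.Blobs.endpoint)
}
```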
Previously working deployment code fails with "plugin did not respond".
`Stack trace from the terraform-provider-azurerm_v3.97.0_x5 plugin:
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x38 pc=0x46849fe]
goroutine 1309 [running]:
github.com/tombuildsstuff/giovanni/storage/2023-11-03/datalakestore/filesystems.Client.GetProperties({0xe895da0?}, {0x8f70db8, 0xc000c039d0}, {0xc0030e4660?, 0xc001bad548?})
github.com/tombuildsstuff/[email protected]/storage/2023-11-03/datalakestore/filesystems/properties_get.go:53 +0x21e
github.com/hashicorp/terraform-provider-azurerm/internal/services/storage.resourceStorageDataLakeGen2FileSystemCreate(0x0?, {0x77418e0?, 0xc0001a2000})
github.com/hashicorp/terraform-provider-azurerm/internal/services/storage/storage_data_lake_gen2_filesystem_resource.go:195 +0xb8c
github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).create(0x8f70d10?, {0x8f70d10?, 0xc002c431a0?}, 0xd?, {0x77418e0?, 0xc0001a2000?})
github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/resource.go:766 +0x163
github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).Apply(0xc001cfaee0, {0x8f70d10, 0xc002c431a0}, 0xc001787ee0, 0xc000ce7000, {0x77418e0, 0xc0001a2000})
github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/resource.go:909 +0xa89
github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*GRPCProviderServer).ApplyResourceChange(0xc000a26288, {0x8f70d10?, 0xc002c430b0?}, 0xc002083180)
github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/grpc_provider.go:1060 +0xdbc
github.com/hashicorp/terraform-plugin-go/tfprotov5/tf5server.(*server).ApplyResourceChange(0xc001d80780, {0x8f70d10?, 0xc002c42750?}, 0xc000c031f0)
github.com/hashicorp/[email protected]/tfprotov5/tf5server/server.go:859 +0x56b
github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/tfplugin5._Provider_ApplyResourceChange_Handler({0x83692e0?, 0xc001d80780}, {0x8f70d10, 0xc002c42750}, 0xc000c03180, 0x0)
github.com/hashicorp/[email protected]/tfprotov5/internal/tfplugin5/tfplugin5_grpc.pb.go:467 +0x169
google.golang.org/grpc.(*Server).processUnaryRPC(0xc0002c41e0, {0x8f98860, 0xc00163d520}, 0xc0020a1680, 0xc001d798f0, 0xe855c98, 0x0)
google.golang.org/[email protected]/server.go:1374 +0xde7
google.golang.org/grpc.(*Server).handleStream(0xc0002c41e0, {0x8f98860, 0xc00163d520}, 0xc0020a1680, 0x0)
google.golang.org/[email protected]/server.go:1751 +0x9e7
google.golang.org/grpc.(*Server).serveStreams.func1.1()
google.golang.org/[email protected]/server.go:986 +0xbb
created by google.golang.org/grpc.(*Server).serveStreams.func1 in goroutine 24
google.golang.org/[email protected]/server.go:997 +0x145
Error: The terraform-provider-azurerm_v3.97.0_x5 plugin crashed!
This is always indicative of a bug within the plugin. It would be immensely
helpful if you could report the crash with the plugin's maintainers so that it
can be fixed. The output above should help diagnose the issue.
Error: Terraform exited with code 1.
Error: Process completed with exit code 1.`
We have noticed that the files under last year's date have been modified. My understanding is that this is not best practice - instead, a new folder should be created for new code.
Please advise.
Add Github Actions/Travis to this repository
As a follow-up to hashicorp/terraform-provider-azurerm#3939 and hashicorp/terraform-provider-azurerm#3925: currently, when you use go-autorest and an XML error is returned from the Azure API, go-autorest fails to decode it, as it hardcodes treating error messages as JSON:
I've tried some dummy fixes borrowed from https://github.com/Azure/azure-sdk-for-go/blob/master/storage/client.go#L977:
diff --git a/vendor/github.com/Azure/go-autorest/autorest/azure/azure.go b/vendor/github.com/Azure/go-autorest/autorest/azure/azure.go
index 3a0a439f..96f9a5ba 100644
--- a/vendor/github.com/Azure/go-autorest/autorest/azure/azure.go
+++ b/vendor/github.com/Azure/go-autorest/autorest/azure/azure.go
@@ -285,17 +285,24 @@ func WithErrorUnlessStatusCode(codes ...int) autorest.RespondDecorator {
var e RequestError
defer resp.Body.Close()
+ var encodedAs autorest.EncodedAs
+ if resp.Header.Get("Content-Type") == "application/xml" {
+ encodedAs = autorest.EncodedAsXML
+ } else {
+ encodedAs = autorest.EncodedAsJSON
+ }
+
// Copy and replace the Body in case it does not contain an error object.
// This will leave the Body available to the caller.
- b, decodeErr := autorest.CopyAndDecode(autorest.EncodedAsJSON, resp.Body, &e)
+ b, decodeErr := autorest.CopyAndDecode(encodedAs, resp.Body, &e)
but apparently more code needs to be changed to get it working.
If this is not a proper place for this issue, feel free to move it to other repository or point me where I should create it.
The ADLS Paths DeleteResponder is expecting a 202 status code from the Delete API rather than the documented 200 response.
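A hedged sketch of the shape of a fix - accepting either status code rather than only one; the helper name is illustrative:

```go
package main

import "fmt"

// isExpectedStatus reports whether the response code is one we accept.
// Accepting both 200 (documented) and 202 (observed) keeps the responder
// tolerant of either behaviour from the service.
func isExpectedStatus(code int, expected ...int) bool {
	for _, e := range expected {
		if code == e {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isExpectedStatus(200, 200, 202)) // true
	fmt.Println(isExpectedStatus(404, 200, 202)) // false
}
```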
Background: hashicorp/terraform-provider-azurerm#7521 (comment)
hashicorp/go-azure-sdk includes a base client for Storage Data Plane - as such we should update the Base Layer to use this rather than Azure/go-autorest, since go-autorest is now end-of-life.
We're currently supporting API Version 2020-08-04 - we should look to introduce support for API Version 2023-11-03. The deltas between 2020-08-04 and 2023-11-03 are documented here: https://learn.microsoft.com/en-us/rest/api/storageservices/previous-azure-storage-service-versions
Currently the SDK supports fetching blob properties, but I would like to suggest that it also support fetching the blob contents. This would allow the Terraform AzureRM provider to add a .contents attribute to its azurerm_storage_blob data source.
Reference issue: hashicorp/terraform-provider-azurerm#18779
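A hedged sketch of what a contents-fetching helper might look like. The real implementation would issue an authorized GET against the blob endpoint (the Get Blob operation); the helper names below are illustrative and authorization is omitted for brevity:

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

// blobURL builds the download URL for a blob; a plain GET on this URL is
// the Get Blob operation.
func blobURL(accountName, containerName, blobName string) string {
	return fmt.Sprintf("https://%s.blob.core.windows.net/%s/%s", accountName, containerName, blobName)
}

// getBlobContents is a hypothetical helper: it downloads and returns the
// blob body, erroring on any non-200 response.
func getBlobContents(client *http.Client, url string) ([]byte, error) {
	resp, err := client.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return nil, fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return io.ReadAll(resp.Body)
}

func main() {
	fmt.Println(blobURL("myaccount", "images", "logo.png"))
}
```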
The auth library that this SDK relies on (within github.com/hashicorp/go-azure-helpers/authentication) has been replaced by the new auth and environments packages within github.com/hashicorp/go-azure-sdk/sdk - as such we should look to update the existing Authorizers and Environments to take advantage of this.
So that we can break this refactoring up by Service, it'd be worth doing this in a backwards-compatible way (that is, retrieving the Environment from both the autorest and environments packages) - which will allow #68 to be done gradually.
Whilst #74 updates the authentication layer to use hashicorp/go-azure-sdk, it doesn't enable Request/Response logging - we should add that.
This logic can be found in hashicorp/terraform-provider-azurerm here: https://github.com/hashicorp/terraform-provider-azurerm/blob/dff425415de4b74f7f156c882cc8ed2b3977a05c/internal/common/middleware.go#L1
Looks like 2019-12-12 got released and is now the default: https://docs.microsoft.com/en-gb/rest/api/storageservices/versioning-for-the-azure-storage-services
It's possible to access Blobs using SAS Tokens - as such, whilst it's only applicable to the Blobs and Containers services, we should add an AutoRest-compatible authorizer for accessing Blobs/Containers using a SAS Token.
cc @mbfrahry
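Conceptually, a SAS authorizer just appends the pre-signed query string to each request URL. A minimal self-contained sketch of that core step (a real version would implement go-autorest's Authorizer interface as a request decorator; the function name here is illustrative):

```go
package main

import (
	"fmt"
	"strings"
)

// applySASToken appends a SAS token's query parameters to a request URL -
// which is all the authorization a SAS-based request needs. It tolerates
// tokens supplied with or without a leading "?", and URLs that already
// carry query parameters.
func applySASToken(requestURL, sasToken string) string {
	token := strings.TrimPrefix(sasToken, "?")
	separator := "?"
	if strings.Contains(requestURL, "?") {
		separator = "&"
	}
	return requestURL + separator + token
}

func main() {
	fmt.Println(applySASToken("https://acc.blob.core.windows.net/container/blob", "?sv=2019-12-12&sig=abc"))
}
```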
Hi
Not a bug report but a question. Does this project handle shared access signatures as a method of auth? I don't immediately see the option but wanted to check before I dig any deeper.
Thanks
Ken
Thanks for creating this repo - this actually solves the problem of versioning the APIs called from Terraform. However, Data Lake storage is not currently supported - can we add it here too?
Per hashicorp/terraform-provider-azurerm#4552, it looks like we need to distinguish between standard and premium storage shares: standard shares allow quota values of 1-5120, while premium shares allow 100-102400.
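A hedged sketch of the validation this implies; the bounds are taken from the issue above, and the function name is illustrative:

```go
package main

import "fmt"

// validateShareQuota checks a share quota against the allowed range for
// the share's tier: 1-5120 for standard, 100-102400 for premium.
func validateShareQuota(quota int, premium bool) error {
	min, max := 1, 5120
	if premium {
		min, max = 100, 102400
	}
	if quota < min || quota > max {
		return fmt.Errorf("quota must be between %d and %d, got %d", min, max, quota)
	}
	return nil
}

func main() {
	fmt.Println(validateShareQuota(50, false)) // nil: valid for standard
	fmt.Println(validateShareQuota(50, true))  // out-of-range error for premium
}
```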
The Azure Storage APIs expose the ability to configure Static Websites via the SetServiceProperties call on the Blob Account: https://docs.microsoft.com/en-us/rest/api/storageservices/set-blob-service-properties (and to retrieve it using GetServiceProperties).
Whilst these methods are exposed in the Azure SDK for Go, they're not complete - as such it'd be worthwhile to add these to Giovanni so that we can configure Static Websites for Blob Storage Accounts, which unblocks hashicorp/terraform-provider-azurerm#1903.
From an implementation perspective this should presumably live within a new accounts package within the Blob package - https://github.com/tombuildsstuff/giovanni/tree/master/storage/2018-11-09/blob - and cover just these two methods.
I ran into an error using Terraform (described in hashicorp/terraform-provider-azurerm#10001) to manage some small files in Azure Storage, and I think the root cause is this guard:
I'm unsure why that guard is there, but if I remove that check I can upload small files and the other tests still pass. I didn't see anything in Put Range about a minimum range - maybe that was a constraint that has since been removed?
Ref: hashicorp/terraform-provider-azurerm#4782
While creating a storage share, the header value for "x-ms-share-quota" triggers a UTF-8 encoding bug:
Error: Error creating Share "bits" (Account "nancyctest1234" / Resource Group "nancyc-rg-1"): shares.Client#Create: Failure responding to request: StatusCode=400 -- Original Error: autorest/azure: error response cannot be parsed: "\ufeffInvalidHeaderValue
The value for one of the HTTP headers is not in the correct format.\nRequestId:ad94872b-101a-00c5-1ec1-9230c3000000\nTime:2019-11-04T03:36:16.4360584Zx-ms-share-quota50" error: invalid character 'ï' looking for beginning of value
It appears that Metadata (Tags) can now be case-insensitive, as such when #68 is completed we should be able to support this (by removing the canonicalisation)
Currently users of this library have to implement parallel uploads themselves - this SDK should expose a helper method to chunk the file and upload the chunks with the requested degree of parallelism.