honeycombio / terraform-provider-honeycombio
A Terraform provider for Honeycomb.io
Home Page: https://registry.terraform.io/providers/honeycombio/honeycombio/latest
License: MIT License
Hi,
following CONTRIBUTION.md you are free to choose a dataset name, but unfortunately this does not work due to this line: https://github.com/kvrhdn/terraform-provider-honeycombio/blob/0cc49010a4f1337c5e69bcc53614485ef457bb36/honeycombio/resource_dataset_test.go#L32
Would adjusting the description be enough?
New feature: you can specify the query_style of a query on a board.
https://changelog.honeycomb.io/tables-now-available-in-boards-170910
The API has a new field query_style: https://docs.honeycomb.io/api/boards-api/#fields-on-a-board
With the new honeycombio_query resource, it becomes confusing to have to differentiate between the honeycombio_query data source and the resource. The honeycombio_query data source is not actually a query, though: it is a query specification, as described in the docs.
To avoid this confusion we should:
- rename the honeycombio_query data source to honeycombio_query_spec
- rename query to query_spec
My use case: I want to use multiline strings in the expression field of a derived column. The web UI is clearly not designed for this; I can apply a function only if I make it a one-liner. Terraform, however, accepts multiline values, and I encountered the problem when I applied. After terraform apply I accessed the "Derived Columns" page in the web UI and got a blank white screen.
The problem
Expected behaviour
How to reproduce
The Terraform provider allows applying multiline values:
resource "honeycombio_derived_column" "tmp_action" {
alias = "tmp_action"
dataset = var.dataset
description = "SLI: successful action"
expression = <<EOT
IF(
REG_MATCH($request, `PATCH https://www.whatever.[a-z]+:443/gw/api/subscriptions/[0-9]+/bar?`),
AND(
OR(
STARTS_WITH($backend_status_code, "2"),
STARTS_WITH($backend_status_code, "3"),
STARTS_WITH($backend_status_code, "4")
),
LTE($backend_processing_time, 2)
)
)
EOT
}
terraform apply succeeds and a subsequent terraform plan shows no changes. The column actually gets created: if I query the fetch_schema endpoint, I can see the column there. The newline characters are encoded as \n and this probably breaks the web interface.
{
"id": "abc1234",
"alias": "tmp_action",
"expression": " IF(\n REG_MATCH($request, `PATCH https://www.whatever.[a-z]+:443/gw/api/subscriptions/[0-9]+/bar?`),\n AND(\n OR(\n STARTS_WITH($backend_status_code, \"2\"), \n STARTS_WITH($backend_status_code, \"3\"), \n STARTS_WITH($backend_status_code, \"4\")\n ),\n LTE($backend_processing_time, 2)\n )\n )\n",
"description": "SLI: successful action",
"creator": {
"created_at": "0001-01-01T00:00:00Z",
"updated_at": "0001-01-01T00:00:00Z",
"email": "",
"firstName": "",
"lastName": "",
"companyName": "",
"role": 0,
"receiveUsageEmails": false,
"id": "xxxxx"
},
When I access the "Derived Columns" page, I get a blank white screen. In order to fix the page, I delete the column via Terraform.
Please let me know if stripping the newline characters on the provider side would make sense in this case.
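A possible workaround on the configuration side (a sketch, not an endorsed fix) is to collapse the heredoc into a single line before it is sent to the API, using Terraform's built-in replace() and trimspace() functions. The expression here is a shortened variant of the one above, purely for illustration:

```hcl
resource "honeycombio_derived_column" "tmp_action" {
  alias       = "tmp_action"
  dataset     = var.dataset
  description = "SLI: successful action"

  # Collapse newlines so the expression reaches Honeycomb as a one-liner,
  # which the "Derived Columns" page can render.
  expression = replace(trimspace(<<-EOT
    IF(
      STARTS_WITH($backend_status_code, "2"),
      LTE($backend_processing_time, 2)
    )
  EOT
  ), "\n", " ")
}
```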
The Honeycomb standard has become HONEYCOMB_API_KEY, so the provider should support it too in the name of consistency. This will deprecate, but continue to support, the current HONEYCOMBIO_APIKEY, and drop support for it altogether in the 1.0.0 release.
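A sketch of the resulting provider configuration (the fallback order between the two environment variables is an assumption based on the deprecation plan above):

```hcl
# With no api_key set, the provider would read HONEYCOMB_API_KEY first and
# fall back to the deprecated HONEYCOMBIO_APIKEY until 1.0.0.
provider "honeycombio" {
  # api_key = var.honeycomb_api_key  # an explicit value overrides both variables
}
```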
When adding a trigger with a query that has a calculation of op = "COUNT" and no column, it fails with the following error: 400 Bad Request: unknown column
A resource to manage API keys.
More information: no documentation available
API: there is no API yet for this
Possible Terraform config:
resource "honeycombio_api_key" "example" {
name = "Backend"
enable = true
permissions = ["send_events"]
}
# honeycombio_api_key exports the field `key`
The installation process for locally built providers changed with Terraform 0.13. We should document this as part of the contributing guide.
A resource to manage SLOs.
More information: https://docs.honeycomb.io/working-with-your-data/slos/
API: there is no API yet for this
Possible Terraform config:
resource "honeycombio_derived_column" "sli" {
# ...
}
resource "honeycombio_slo" "example" {
name = "API service"
description = "..."
column = honeycombio_derived_column.sli.alias
time_period = "28" // in days
target_percentage = 99.9
burn_alert {
exhaustion_time = 4
# use same structure as trigger recipients
recipient {
type = "pagerduty"
}
}
}
Currently, the provider client does not support the CONCURRENCY calculation operator and will return an error if you try to craft a query making use of it. CONCURRENCY is a unary operator similar to COUNT and will need the same validation checking.
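Once supported, the operator would presumably be used like any other unary calculation (a sketch; as with COUNT, no column may be set):

```hcl
data "honeycombio_query" "concurrency" {
  calculation {
    # unary operator: specifying a column should fail validation, as for COUNT
    op = "CONCURRENCY"
  }
}
```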
It's possible to set trigger recipients of type webhook using the UI. This isn't possible yet using the Terraform provider, though the API most likely allows this.
It's probably not possible to create webhooks directly (has to be done manually in the integration center), which would be similar to Slack recipients.
TODO:
- support type = "webhook" in honeycombio_trigger (update docs as well)
- support type = "webhook" in honeycombio_trigger_recipient (update docs as well)

We would like to see full support for graph and board settings in the provider.
Query related items:
Board related items:
Is your feature request related to a problem? Please describe.
Most/all of our boards are created via Terraform, and these are blockers for us at the moment.
Describe the solution you'd like
Support for the above.
Describe alternatives you've considered
We are currently creating boards by hand and plan to go back and migrate to Terraform once these features (and others) land.
Additional context
None.
A resource to create boards
More information: https://docs.honeycomb.io/working-with-your-data/collaborating/boards/#docs-sidebar
API: https://docs.honeycomb.io/api/boards-api/#docs-sidebar
Edit 3 Aug: possible DSL for creating a board:
data "honeycombio_query" "query_1" {
# ...
}
data "honeycombio_query" "query_2" {
# ...
}
resource "honeycombio_board" "example" {
name = "Test board"
description = "This is an example."
style = "visual" # list or visual
query {
caption = "My first query"
dataset = "dataset-1"
query_json = data.honeycombio_query.query_1.json
}
query {
caption = "The same query but a different dataset"
dataset = "dataset-2"
query_json = data.honeycombio_query.query_1.json
}
query {
dataset = "dataset-1"
query_json = data.honeycombio_query.query_2.json
}
}
Two new filter ops have been released: in and not-in: https://changelog.honeycomb.io/in-and-not-in-added-to-query-builder-172740
Docs: https://docs.honeycomb.io/api/query-specification/#filter-operators
TODO
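In the honeycombio_query data source this could look something like the following (a sketch; per the query-specification docs the value for in / not-in is a list, and how that list is best expressed in HCL is part of the open work):

```hcl
data "honeycombio_query" "tenants" {
  calculation {
    op = "COUNT"
  }
  filter {
    column = "app.tenant"
    op     = "in"
    # assumption: a comma-separated encoding of the list value
    value  = "tenant-a,tenant-b"
  }
}
```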
A resource to manage the derived columns of a dataset.
More information: https://docs.honeycomb.io/working-with-your-data/customizing-your-query/derived-columns/
Reference: https://docs.honeycomb.io/working-with-your-data/customizing-your-query/derived-columns/reference/#docs-sidebar
API: there is no API yet for this. Update: this is okay now :)
Possible Terraform schema:
resource "honeycombio_derived_column" "example" {
dataset = "my-dataset"
alias = "duration_ms_log10"
description = "Log10 of duration_ms"
function = "LOG10($\"duration_ms\")"
}
provider version = 0.0.7
Disclaimer: this might be a bad idea.
I've got the following terraform configuration that applies successfully and works as expected:
data "honeycombio_trigger_recipient" "slack" {
dataset = "my-very-special-datset"
type = "slack"
target = "#alerts"
}
resource "honeycombio_trigger" "trigger" {
//query omitted
recipient {
id = data.honeycombio_trigger_recipient.slack.id
}
}
However, I receive the following when I run a plan/apply:
~ recipient {
id = "some-id"
- target = "#alerts" -> null
- type = "slack" -> null
}
It would be nice if that diff could be suppressed without duplicating the recipient target and type.
A resource to manage named queries and their description.
More information: once you've built a query in the UI, you can name this query and add a description. This makes it easier to share this query. Additionally, this title is shown on boards.
API: there is no API yet for this
Possible Terraform schema:
resource "honeycombio_query" "example" {
name = "My query"
description = "A description of this query. _Supports Markdown._"
dataset = "my-dataset"
query_json = data.honeycombio_query.my_query.json
}
To discuss:
- What to do with data "honeycombio_query"? Maybe rename the data source to honeycombio_query_spec so it's clear this is only the specification of a query. cc @fitzoh, who mentioned this in Slack.
With the introduction of the new queries API, triggers and boards now link to their queries using a query_id. To ease the transition, the triggers and boards API will temporarily return both values (the original query and the new query_id).
We should refactor the Terraform logic so we don't rely on query anymore and instead fetch the query specification using query_id.
Add contributing.md to help out people that want to contribute.
Things we should discuss:
A data source that retrieves a list of all datasets present in the account.
This would be useful for creating standardized boards/triggers across all datasets.
Could potentially include the ability to filter based on the name (e.g. all datasets that end with -prod).
Requires Honeycomb API support.
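A sketch of how such a data source could be used (the honeycombio_datasets data source, its name_suffix argument, and its names attribute are all hypothetical):

```hcl
# Hypothetical data source listing every dataset in the account,
# optionally filtered by a name suffix.
data "honeycombio_datasets" "prod" {
  name_suffix = "-prod"  # hypothetical filter argument
}

# Create a standardized trigger in each matching dataset.
resource "honeycombio_trigger" "standard" {
  for_each = toset(data.honeycombio_datasets.prod.names)
  dataset  = each.value
  # ...
}
```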
Add a pair of derived column data sources: honeycombio_derived_column and honeycombio_derived_columns
It's apparently already possible to generate documentation based upon the various Description fields within the resource definitions.
To do:
Links
https://github.com/hashicorp/terraform-plugin-docs
https://github.com/hashicorp/terraform-provider-scaffolding/blob/main/main.go#L20
Currently, the provider client does not support havings and will return an error if you try to craft a query making use of them.
See: https://docs.honeycomb.io/api/query-specification/#fields-on-a-query-specification
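Once supported, a having clause on the query data source might look like this (a sketch; the block name and its arguments mirror the havings field in the query-specification docs and are assumptions):

```hcl
data "honeycombio_query" "slow_endpoints" {
  calculation {
    op     = "P99"
    column = "duration_ms"
  }
  breakdowns = ["endpoint"]

  # hypothetical block mirroring the "havings" field of the Query Specification
  having {
    calculate_op = "P99"
    column       = "duration_ms"
    op           = ">"
    value        = 1000
  }
}
```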
Hey!
Thanks for this provider!
While updating the expression value of a honeycombio_derived_column resource, the provider forces resource replacement, while the same change works fine in the UI and the API supports in-place updates: https://docs.honeycomb.io/api/derived-columns/#update-a-derived-column
In our case, deletion of the derived column is failing because it is referenced in SLOs.
https://github.com/kvrhdn/terraform-provider-honeycombio/blob/7ec7ca02cde626c5be91f5a7b8524afb29b55f3b/honeycombio/resource_derived_column.go#L29-L33
Is this a bug, or are there reasons for this behavior (recreation of the resource instead of an in-place update)?
Apparently I've been using the original Terraform Plugin SDK, while there is a v2 version with refactored internals and improvements.
Since v2 is now the way to go, we should probably adopt it as well (preferably before adding too many resources and data sources).
More information: https://www.terraform.io/docs/extend/guides/v2-upgrade-guide.html
Currently only possible to configure in the UI.
Unblocked with the recent addition of the column_layout field in the Boards API. It defaults to multi, but single is also possible.
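A sketch of how this could surface on the board resource (the attribute name comes from the Boards API field; its exact placement in the provider schema is an assumption):

```hcl
resource "honeycombio_board" "example" {
  name  = "Example board"
  style = "visual"

  # mirrors the Boards API column_layout field: "multi" (default) or "single"
  column_layout = "single"
}
```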
Honeycomb has recently taken ownership of the Terraform Provider which was originally developed and maintained by community member Koenraad Verheyden (thank you! 🙏🏻 ).
Leading up to the first Honeycomb-owned release of the Provider there are three breaking changes we feel need to be made:
- The honeycombio_query data source will be renamed to honeycombio_query_specification. Initially raised in #65.
- The honeycombio_board resource will no longer support inline Query JSON and will instead require that you create a honeycombio_query resource and pass the ID in the list of queries on the Board. Initially raised in #55, and currently blocking #84 and #82.
- The honeycombio_trigger resource will no longer support inline Query JSON and will instead require that you create a honeycombio_query resource and pass the ID to the Trigger resource. Also raised in #55.

As the move to the Honeycomb-maintained version of the provider will require that you update the namespace used in your required_providers configuration from kvrhdn to honeycombio, we hope that these changes will be more easily managed.
This informal RFC was written to offer transparency around these changes and give the provider's user community a chance to comment on the changes.
This data source never allowed you to fetch an existing Query object, but instead helped you form a valid Query Specification in HCL. Renaming it to honeycombio_query_specification seems most accurate, and the change should be easily handled by a simple 'find and replace'.
The Provider currently supports providing one or more inline queries (as Query Spec JSON query_json) to a Board resource to create a Board. This was created before either the Query API or the Query Annotations API existed, but both customers and the field team are looking to be able to build Boards via Terraform made up of fully annotated queries managed in code.
Inline queries are problematic because providing the query JSON to the API makes a Query object for you which is then difficult to track in Terraform as it’s a “computed sometimes” attribute of the resource. The explicit creation of a Query resource also feels more declarative and ✨ Terraform-ish ✨ .
Boards with Terraform today look something like this:
resource "honeycombio_board" "myboard" {
name = "Test board managed by Terraform"
style = "list"
query {
caption = "test query"
dataset = var.dataset
query_json = data.honeycombio_query.test.json
}
query {
caption = "my verbose query"
query_style = "combo"
dataset = var.dataset
// Inline query
query_json = <<EOH
{
"time_range": 604800,
<truncated for brevity>
}
EOH
}
}
The current Board resource implementation ignores the returned query_id completely.
The change to the Board resource’s schema will remove the ability to provide a Query Specification directly and instead require that you make use of the existing Query resource to construct Queries – with the recommendation that you do so with the assistance of the (newly renamed) Query Specification data source.
The above would turn into something like this:
resource "honeycombio_query" "helpful-query" {
  dataset    = var.dataset
  query_json = data.honeycombio_query_specification.helpful.json
}
resource "honeycombio_query" "other-query" {
  dataset    = var.dataset
  query_json = data.honeycombio_query_specification.other.json
}
resource "honeycombio_board" "myboard" {
name = "Test board managed by Terraform"
style = "list"
query {
caption = "A very Helpful Query"
dataset = var.dataset
query_id = honeycombio_query.helpful-query.id
}
query {
caption = "my verbose query"
query_style = "table"
dataset = var.dataset
query_id = honeycombio_query.other-query.id
}
}
The above paves the way for cleanly making use of the Query Annotation resource to name and describe the queries on the Board.
Similar to the Board resource, the Provider currently supports passing an inline Query (as Query Specification JSON) to create a Trigger. The explicit creation of a Query resource also feels more declarative and ✨ Terraform-ish ✨ .
This change is ultimately intended to bring more consistency to the Provider’s interface.
Triggers in Terraform today look something like this:
resource "honeycombio_trigger" "mytrigger" {
name = "Trigger managed by Terraform"
dataset = var.dataset
query_json = data.honeycombio_query.mytrigger.json
frequency = 300
threshold {
op = ">"
value = 50
}
recipient {
type = "email"
target = "[email protected]"
}
}
The change to the Trigger resource’s schema will remove the ability to provide a Query Specification directly and instead require that you make use of the existing Query resource to construct Queries – with the recommendation that you do so with the assistance of the (newly renamed) Query Specification data source.
The above would turn into something like this:
resource "honeycombio_query" "mytrigger" {
  dataset    = var.dataset
  query_json = data.honeycombio_query_specification.mytrigger.json
}
resource "honeycombio_trigger" "mytrigger" {
name = "Trigger managed by Terraform"
dataset = var.dataset
query_id = honeycombio_query.mytrigger.id
frequency = 300
threshold {
op = ">"
value = 50
}
recipient {
type = "email"
target = "[email protected]"
}
}
Creating a board using this honeycombio_query:
data "honeycombio_query" "db_heatmap" {
calculation {
op = "HEATMAP"
column = "db.duration"
}
filter {
column = "db.duration"
op = ">"
    value  = "500"
}
}
(Note that the duration is quoted)
This results in no data being shown, while data is returned if the value is converted to a numeric value.
A resource to manage the columns of a dataset.
Every query I add to a board has Untitled Query as a name. If I change it manually in the web UI and re-run terraform apply, it reverts back. Is there a way to add a query_annotation to a query that gets added to a board?
I was attempting to import a manually created board into terraform, but that does not appear to be documented.
From looking at the docs, no resources currently support importing.
A data source to refer to existing recipients from the integration center.
Both triggers and SLOs support specifying recipients. Specifying an email recipient is fairly straightforward, but to use Slack you have to use the recipient ID.
This data source would be a convenient way to get this ID.
Documentation
Pseudo Terraform code:
data "honeycombio_recipient" "slack" {
type = "slack"
target = "#alerts"
}
data "honeycombio_recipient" "pagerduty" {
type = "pagerduty"
}
resource "honeycombio_slo" "example" {
  # ...
  burn_alert {
    # ...
    recipients = [
      data.honeycombio_recipient.slack.id,
      data.honeycombio_recipient.pagerduty.id,
    ]
  }
}
resource "honeycombio_trigger" "example" {
  # ...
  recipients = [
    data.honeycombio_recipient.slack.id,
  ]
}
Ensure all provided examples (in example/) are valid and apply correctly. Perhaps run this after test-suite success?
Add a pair of column data sources: honeycombio_column and honeycombio_columns.
A resource to create markers.
More information: https://docs.honeycomb.io/working-with-your-data/customizing-your-query/markers/
Available API: https://docs.honeycomb.io/api/markers/
HCL code:
resource "honeycombio_marker" "marker" {
message = "Message"
type = "deploy"
url = "https://www.honeycomb.io/"
}
Running terraform plan works successfully, but running terraform apply crashes with: Error: rpc error: code = Unavailable desc = transport is closing
Terraform v0.14.7
+ provider registry.terraform.io/kvrhdn/honeycombio v0.1.2
Below is the config in main.tf:
terraform {
required_providers {
honeycombio = {
source = "kvrhdn/honeycombio"
version = "~> 0.1.2"
}
}
}
data "honeycombio_query" "query" {
calculation {
op = "COUNT"
}
time_range = 120
}
resource "honeycombio_trigger" "trigger" {
name = "test"
description = "test"
disabled = false
query_json = data.honeycombio_query.query.json
dataset = "ssherbondy-dev"
frequency = 120
threshold {
op = ">"
value = 300000
}
}
Below is the output from terraform apply:
Error: rpc error: code = Unavailable desc = transport is closing
panic: runtime error: invalid memory address or nil pointer dereference
2021-02-18T22:13:20.426-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: [signal SIGSEGV: segmentation violation code=0x1 addr=0x38 pc=0x1695547]
2021-02-18T22:13:20.426-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2:
2021-02-18T22:13:20.426-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: goroutine 49 [running]:
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/kvrhdn/terraform-provider-honeycombio/honeycombio.resourceTriggerRead(0x1931800, 0xc0002c51a0, 0xc0002f2b80, 0x172cee0, 0xc000044ba0, 0xc0001aee00, 0xc0001aee70, 0x0)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/kvrhdn/terraform-provider-honeycombio/honeycombio/resource_trigger.go:162 +0x267
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/kvrhdn/terraform-provider-honeycombio/honeycombio.resourceTriggerCreate(0x1931800, 0xc0002c51a0, 0xc0002f2b80, 0x172cee0, 0xc000044ba0, 0xc0002feb90, 0x12c635a, 0xc0004a9200)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/kvrhdn/terraform-provider-honeycombio/honeycombio/resource_trigger.go:143 +0x295
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).create(0xc00039c6e0, 0x1931780, 0xc0002e2500, 0xc0002f2b80, 0x172cee0, 0xc000044ba0, 0x0, 0x0, 0x0)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/resource.go:276 +0x1ec
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).Apply(0xc00039c6e0, 0x1931780, 0xc0002e2500, 0xc0001ae380, 0xc0004a9200, 0x172cee0, 0xc000044ba0, 0x0, 0x0, 0x0, ...)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/[email protected]/helper/schema/resource.go:387 +0x681
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/internal/helper/plugin.(*GRPCProviderServer).ApplyResourceChange(0xc000262200, 0x1931780, 0xc0002e2500, 0xc00013df80, 0xc000262200, 0xc000262210, 0x1849f90)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/[email protected]/internal/helper/plugin/grpc_provider.go:952 +0x8b2
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/internal/tfplugin5._Provider_ApplyResourceChange_Handler.func1(0x1931780, 0xc0002e2500, 0x17dfc00, 0xc00013df80, 0xc0002e2500, 0x1763a00, 0xc0002c4c01, 0xc0004a8f40)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/[email protected]/internal/tfplugin5/tfplugin5.pb.go:3312 +0x86
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/plugin.Serve.func3.1(0x1931840, 0xc0002e9050, 0x17dfc00, 0xc00013df80, 0xc0004a8f20, 0xc0004a8f40, 0xc000330ba0, 0x11b92e8, 0x17b8500, 0xc0002e9050)
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/[email protected]/plugin/serve.go:76 +0x87
2021-02-18T22:13:20.427-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/v2/internal/tfplugin5._Provider_ApplyResourceChange_Handler(0x17ee100, 0xc000262200, 0x1931840, 0xc0002e9050, 0xc0002c4cc0, 0xc000810a20, 0x1931840, 0xc0002e9050, 0xc0001f26c0, 0x223)
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: github.com/hashicorp/terraform-plugin-sdk/[email protected]/internal/tfplugin5/tfplugin5.pb.go:3314 +0x14b
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/grpc.(*Server).processUnaryRPC(0xc0001fcfc0, 0x1939c80, 0xc000703e00, 0xc000310100, 0xc000712510, 0x1e30ce0, 0x0, 0x0, 0x0)
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/[email protected]/server.go:1171 +0x50a
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/grpc.(*Server).handleStream(0xc0001fcfc0, 0x1939c80, 0xc000703e00, 0xc000310100, 0x0)
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/[email protected]/server.go:1494 +0xccd
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/grpc.(*Server).serveStreams.func1.2(0xc0000b4310, 0xc0001fcfc0, 0x1939c80, 0xc000703e00, 0xc000310100)
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/[email protected]/server.go:834 +0xa1
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: created by google.golang.org/grpc.(*Server).serveStreams.func1
2021-02-18T22:13:20.428-0500 [DEBUG] plugin.terraform-provider-honeycombio_v0.1.2: google.golang.org/[email protected]/server.go:832 +0x204
2021-02-18T22:13:20.428-0500 [WARN] plugin.stdio: received EOF, stopping recv loop: err="rpc error: code = Unavailable desc = transport is closing"
2021/02/18 22:13:20 [DEBUG] honeycombio_trigger.trigger: apply errored, but we're indicating that via the Error pointer rather than returning it: rpc error: code = Unavailable desc = transport is closing
2021/02/18 22:13:20 [TRACE] EvalMaybeTainted: honeycombio_trigger.trigger encountered an error during creation, so it is now marked as tainted
2021/02/18 22:13:20 [TRACE] EvalWriteState: removing state object for honeycombio_trigger.trigger
2021/02/18 22:13:20 [TRACE] EvalApplyProvisioners: honeycombio_trigger.trigger has no state, so skipping provisioners
2021/02/18 22:13:20 [TRACE] EvalMaybeTainted: honeycombio_trigger.trigger encountered an error during creation, so it is now marked as tainted
2021/02/18 22:13:20 [TRACE] EvalWriteState: removing state object for honeycombio_trigger.trigger
2021/02/18 22:13:20 [TRACE] vertex "honeycombio_trigger.trigger": visit complete
2021/02/18 22:13:20 [TRACE] dag/walk: upstream of "provider[\"registry.terraform.io/kvrhdn/honeycombio\"] (close)" errored, so skipping
2021/02/18 22:13:20 [TRACE] dag/walk: upstream of "meta.count-boundary (EachMode fixup)" errored, so skipping
2021/02/18 22:13:20 [TRACE] dag/walk: upstream of "root" errored, so skipping
2021-02-18T22:13:20.429-0500 [DEBUG] plugin: plugin process exited: path=.terraform/providers/registry.terraform.io/kvrhdn/honeycombio/0.1.2/darwin_amd64/terraform-provider-honeycombio_v0.1.2 pid=40062 error="exit status 2"
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: not making a backup, because the new snapshot is identical to the old
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: no state changes since last snapshot
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: writing snapshot at terraform.tfstate
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: removing lock metadata file .terraform.tfstate.lock.info
2021/02/18 22:13:20 [TRACE] statemgr.Filesystem: unlocking terraform.tfstate using fcntl flock
2021-02-18T22:13:20.446-0500 [DEBUG] plugin: plugin exited
When I review the list of triggers in Honeycomb, I can see that the trigger was created as expected. It seems like there may be a failure when reading the trigger from the Honeycomb API after it is created.
I attempted to create a query using the op RATE_SUM, but it appears that it is not on a whitelist in the provider.
Error: expected calculation.7.op to be one of [COUNT SUM AVG COUNT_DISTINCT MAX MIN P001 P01 P05 P10 P25 P50 P75 P90 P95 P99 P999 HEATMAP], got RATE_SUM
Versions
Steps to reproduce
Additional context
As @fitzoh proposed on the Honeycomb Slack: before running the acceptance tests, we could initialize a fresh dataset with all the expected columns. Contributors would only need to supply an API key to run all the tests.
This would avoid a lot of manual setup work and make the tests more deterministic. Currently, the tests will fail if the dataset is missing certain columns (e.g. duration_ms, app.tenant).
A resource to create triggers.
More information: https://docs.honeycomb.io/working-with-your-data/triggers/
API: https://docs.honeycomb.io/api/triggers/
Possible Terraform config (this is not final + probably not legal Terraform code):
resource "honeycombio_trigger" "trigger" {
name = "Requests are slower than usual"
description = "Average duration of all requests for the last 5 minutes."
# NOTE: maybe `enabled` would be simpler to grok?
disabled = false
# NOTE: should we prefer using terminology from the UI? I.e. `visualize`, `where`, `group by`
query {
# exactly one calculation is required
calculation {
op = "AVG"
column = "duration_ms"
}
# zero or more filter blocks
filter {
column = "trace.parent_id"
op = "does-not-exist"
}
# should also be supported: filter_combination, breakdowns
}
frequency = 300 // in seconds, 5 minutes
threshold {
op = ">"
value = 1000
}
# zero or more recipients
recipient {
type = "email"
target = "[email protected]"
}
recipient {
type = "pagerduty"
}
}
While trying to work around #17, I tried setting a column on my COUNT calculation. The terraform apply succeeded, but upon trying to view it I received the following error:
This validation may belong at the API layer (where it is missing), but it might be worth considering at the Terraform layer as well.
The latest release of go-honeycombio adds support for time_range, granularity, start_time and end_time in QuerySpec.
We should also add these to honeycombio_query.
We are seeing query filters fail to validate if we need to specify anything other than a string as a comparison value: float, int, etc.
Versions
Latest version of the provider.
Steps to reproduce
Use something like the following in the query definition:
filter {
column = honeycombio_derived_column.<derived-column-name>.alias
op = ">"
value_float = 0.0
}
Additional context
None required.
Terraform SDK v2 introduces diag.Diagnostics, which are error messages with additional context (you can set the severity and configure an exact path).
Example: https://github.com/hashicorp/terraform-plugin-sdk/blob/master/helper/validation/map.go#L22
It would be interesting to start introducing these where applicable.
Example of the output:
Currently, the API client is hard-coded to assume https://api.honeycomb.io, which is an excellent default but prevents usage in Honeycomb-internal environments or via mocks for testing.
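A sketch of how this could become configurable (the api_url argument is hypothetical):

```hcl
provider "honeycombio" {
  # hypothetical argument overriding the default https://api.honeycomb.io
  api_url = "https://api.honeycomb.internal"
}
```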
Hey,
Would it be possible to have the provider not treat HTTP 409s as an error? When I run terraform apply with an empty state against a dataset whose columns already exist, I get this error:
╷
│ Error: 409 Conflict: Can't insert, column already exists with this key_name or alias.
│
│ on columns.tf line 15, in resource "honeycombio_column" "column":
│ 15: resource "honeycombio_column" "column" {
│
╵
It'd be awesome if in this case it updated the existing column instead of having to manually import all of them.
Thanks!
A resource to manage datasets.
From Honeycomb Slack:
Automation and managing everything as IaC when possible 🙂 When an environment is created, the appropriate dataset is created that applications would use to send their data. Granted these will most likely never change or change every ice age. New engineers on the team are able to browse our code and understand how the system is composed
[source]
In preparation for releasing a v0.1.0, I'd like to do the following:
column