Hi Team,
We were running the Terraform code below and it had been working fine, but now we get the following error:
Error: Float64 Type Validation Error

Value %!s(*big.Float=0.8) cannot be represented as a 64-bit floating point.
Inspecting the values with terraform console:

> keys(yamldecode(file("./_files/ably_config.yaml"))["brave"].queues)
[
  "ably-channel-sink",
  "kafka-connect-ably",
]
> yamldecode(file("./_files/ably_config.yaml"))["brave"].queues["ably-channel-sink"].ttl
60
> yamldecode(file("./_files/ably_config.yaml"))["brave"].queues["ably-channel-sink"].max_length
10000
> yamldecode(file("./_files/ably_config.yaml"))["brave"].queues["ably-channel-sink"].region
"us-east-1-a"
> exit
resource "ably_queue" "queue" { for_each = toset(keys(local.ably.queues)) app_id = data.vault_generic_secret.ably_app_id.data["id"] name = each.key ttl = local.ably.queues[each.key].ttl max_length = local.ably.queues[each.key].max_length region = local.ably.queues[each.key].region }
Hi, have you found a solution to this issue?
Hi @Bharathkumarraju @TessilimiTheo, sorry for the slow reply.
We are aware of the issue and it's on our backlog to investigate. I assume it's still an issue for you?