
Added logging package and fixed issue with API calls not being shown in DEBUG or lower log levels. #2706

Closed
wants to merge 1 commit

Conversation


@tanmay-db commented on Sep 21, 2023

Changes

The Terraform provider never had logging configured correctly: it used to log all levels irrespective of the log level specified. The issue (API calls not being shown at the DEBUG log level) surfaced with the TF version bump to 1.20.9, which included a Go SDK version bump to 0.10.0; in that release, PR databricks/databricks-sdk-go#426 introduced INFO as the default log level, so the DEBUG level was never enabled even when it was passed through the environment variable.

Tests

Unit tests added; integration tests are running.

Also tested manually with the following configuration:

terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      version = "1.26.0"
    }
  }
}

provider "databricks" {
  profile = <redacted>
}

data "databricks_catalogs" "all" {}

output "all_catalogs" {
  value = data.databricks_catalogs.all
}

which shows the API calls:

2023-09-21T03:03:41.340+0200 [DEBUG] provider.terraform-provider-databricks: GET /api/2.1/unity-catalog/catalogs
< HTTP/2.0

  • make test run locally
  • relevant change in docs/ folder
  • covered with integration tests in internal/acceptance
  • relevant acceptance tests are passing
  • using Go SDK

@tanmay-db tanmay-db requested review from a team as code owners September 21, 2023 07:53
@tanmay-db tanmay-db requested review from mgyucht and removed request for a team September 21, 2023 07:54
@tanmay-db tanmay-db changed the title Added logging for terraform provider databricks Added logging in terraform provider databricks Sep 21, 2023
@tanmay-db tanmay-db changed the title Added logging in terraform provider databricks Added logging package and fixed issue with API calls not being shown in DEBUG or lower log levels. Sep 21, 2023
return logger.Level(logger.LevelTrace)
} else {
// In all other cases we use INFO log level to be consistent with Go SDK
return logger.Level(logger.LevelInfo)
@tanmay-db commented:

There seems to be another issue with how the terraform provider is configured: if TF_LOG="" or is not set, then the INFO level isn't used.

This is true even without these changes.

@tanmay-db commented:


It works if TF_LOG=INFO is set.

@tanmay-db commented on Sep 21, 2023:


I can try to investigate why this is happening, but given that it was happening before (not sure if it's expected behaviour) and can easily be mitigated by setting TF_LOG=INFO, we could go ahead with these changes and file a bug to fix this later.

@mgyucht left a comment:


While this would solve the immediate need, I don't think it's the right way to solve this problem. Instead, we should implement the Logger interface from the SDK using the tflog package and set the SDK logger to use that.


type realEnvOperations struct{}

func (r *realEnvOperations) Getenv(key string) string {
@mgyucht commented:


nit: for this and line 28, we should probably use value receivers rather than pointer receivers. There isn't really a reason to use a pointer receiver here, and a value receiver categorically prevents nil-pointer errors from ever happening.

@@ -41,6 +42,7 @@ func init() {
// IMPORTANT: this line cannot be changed, because it's used for
// internal purposes at Databricks.
useragent.WithProduct("databricks-tf-provider", common.Version())
logging.SetLogger()
@mgyucht commented:


I don't know that it's great to do this in init(), since it assumes that the logging configuration is done fully statically by TF (and I'm a bit nervous about putting things in init() that don't have to be there). Maybe we should move this to ConfigureContextFunc.

}

func terraformToSDKLogLevel(envOps envOperations, logGetter logLevelGetter) logger.Level {
// Exceptions for unknown log levels are handled in the underlying LogLevel() method
@mgyucht commented:


Interesting. This probably works fine given the current SDK logging architecture and gets us back to where we were, but I think it's a bit of a hack to go around the TF logging infra and read from the environment like this. For instance, users can't enable debug logging for just our provider (otherwise we would also need to check TF_LOG_PROVIDER_DATABRICKS as well).

Another approach would be to define a struct and implement the logging.Logger interface, then configure the SDK to use this logger. This is what is done in the AWS TF provider implementation. They implement a custom logger that implements their Logger interface and then configure the client to use that logger at configuration time. The advantage of this approach is that you let TF determine what gets printed to the log (so, for instance, TF could change the log level while the provider was running, or provide different ways for users to specify log level rather than through environment variables).

@tanmay-db commented:

Thanks @mgyucht for the review; closing this in favour of #2747. That PR is still WIP and needs cleanup and tests, but it works fine when tested manually.

@tanmay-db tanmay-db closed this Sep 29, 2023