Databricks v1.87.0 published on Friday, Feb 20, 2026 by Pulumi
Using getFeatureEngineeringKafkaConfigs
Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.
TypeScript:

function getFeatureEngineeringKafkaConfigs(args: GetFeatureEngineeringKafkaConfigsArgs, opts?: InvokeOptions): Promise<GetFeatureEngineeringKafkaConfigsResult>
function getFeatureEngineeringKafkaConfigsOutput(args: GetFeatureEngineeringKafkaConfigsOutputArgs, opts?: InvokeOptions): Output<GetFeatureEngineeringKafkaConfigsResult>

Python:

def get_feature_engineering_kafka_configs(page_size: Optional[int] = None,
                                          provider_config: Optional[GetFeatureEngineeringKafkaConfigsProviderConfig] = None,
                                          opts: Optional[InvokeOptions] = None) -> GetFeatureEngineeringKafkaConfigsResult
def get_feature_engineering_kafka_configs_output(page_size: Optional[pulumi.Input[int]] = None,
                                                 provider_config: Optional[pulumi.Input[GetFeatureEngineeringKafkaConfigsProviderConfigArgs]] = None,
                                                 opts: Optional[InvokeOptions] = None) -> Output[GetFeatureEngineeringKafkaConfigsResult]

Go:

func GetFeatureEngineeringKafkaConfigs(ctx *Context, args *GetFeatureEngineeringKafkaConfigsArgs, opts ...InvokeOption) (*GetFeatureEngineeringKafkaConfigsResult, error)
func GetFeatureEngineeringKafkaConfigsOutput(ctx *Context, args *GetFeatureEngineeringKafkaConfigsOutputArgs, opts ...InvokeOption) GetFeatureEngineeringKafkaConfigsResultOutput

> Note: This function is named GetFeatureEngineeringKafkaConfigs in the Go SDK.

C#:

public static class GetFeatureEngineeringKafkaConfigs
{
    public static Task<GetFeatureEngineeringKafkaConfigsResult> InvokeAsync(GetFeatureEngineeringKafkaConfigsArgs args, InvokeOptions? opts = null)
    public static Output<GetFeatureEngineeringKafkaConfigsResult> Invoke(GetFeatureEngineeringKafkaConfigsInvokeArgs args, InvokeOptions? opts = null)
}

Java:

public static CompletableFuture<GetFeatureEngineeringKafkaConfigsResult> getFeatureEngineeringKafkaConfigs(GetFeatureEngineeringKafkaConfigsArgs args, InvokeOptions options)
public static Output<GetFeatureEngineeringKafkaConfigsResult> getFeatureEngineeringKafkaConfigs(GetFeatureEngineeringKafkaConfigsArgs args, InvokeOptions options)
YAML:

fn::invoke:
  function: databricks:index/getFeatureEngineeringKafkaConfigs:getFeatureEngineeringKafkaConfigs
  arguments:
    # arguments dictionary

The following arguments are supported:
- pageSize (int) - The maximum number of results to return.
- providerConfig (GetFeatureEngineeringKafkaConfigsProviderConfig) - Configure the provider for management through an account-level provider.

Property names follow each SDK's conventions: camelCase (pageSize, providerConfig) in TypeScript, Java, and YAML; PascalCase (PageSize, ProviderConfig) in Go and C#; snake_case (page_size, provider_config) in Python.
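As an illustrative sketch of the YAML invoke form above (the variable name and the pageSize value are hypothetical, not taken from this reference), a program might look like:

```yaml
variables:
  kafkaConfigs:
    fn::invoke:
      function: databricks:index/getFeatureEngineeringKafkaConfigs:getFeatureEngineeringKafkaConfigs
      arguments:
        pageSize: 50
```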
getFeatureEngineeringKafkaConfigs Result

The following output properties are available:

- id (string) - The provider-assigned unique ID for this managed resource.
- kafkaConfigs (List<GetFeatureEngineeringKafkaConfigsKafkaConfig>)
- pageSize (int)
- providerConfig (GetFeatureEngineeringKafkaConfigsProviderConfig)
Supporting Types
GetFeatureEngineeringKafkaConfigsKafkaConfig
- authConfig (GetFeatureEngineeringKafkaConfigsKafkaConfigAuthConfig) - (AuthConfig) Authentication configuration for the connection to topics.
- backfillSource (GetFeatureEngineeringKafkaConfigsKafkaConfigBackfillSource) - (BackfillSource) A user-provided and managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward-filling data. The schema for this source must exactly match the key and value schemas specified for this Kafka config.
- bootstrapServers (string) - A comma-separated list of host/port pairs pointing to the Kafka cluster.
- extraOptions (map of string to string) - Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*).
- keySchema (GetFeatureEngineeringKafkaConfigsKafkaConfigKeySchema) - (SchemaConfig) Schema configuration for extracting message keys from topics. At least one of keySchema and valueSchema must be provided.
- name (string) - Name that uniquely identifies this Kafka config within the metastore. This is the identifier used from the Feature object to reference these configs for a feature, and can be distinct from the topic name.
- subscriptionMode (GetFeatureEngineeringKafkaConfigsKafkaConfigSubscriptionMode) - (SubscriptionMode) Options to configure which Kafka topics to pull data from.
- valueSchema (GetFeatureEngineeringKafkaConfigsKafkaConfigValueSchema) - (SchemaConfig) Schema configuration for extracting message values from topics. At least one of keySchema and valueSchema must be provided.
- providerConfig (GetFeatureEngineeringKafkaConfigsKafkaConfigProviderConfig) - Configure the provider for management through an account-level provider.
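The "at least one of keySchema and valueSchema" constraint above can be expressed as a small standalone check. This is a sketch; validate_schema_config is a hypothetical helper, not part of the SDK:

```python
def validate_schema_config(key_schema, value_schema):
    """Enforce that at least one of key_schema and value_schema is provided."""
    if key_schema is None and value_schema is None:
        raise ValueError("at least one of key_schema and value_schema must be provided")
    return {"key_schema": key_schema, "value_schema": value_schema}

# Valid: only a value schema is supplied.
cfg = validate_schema_config(None, '{"type": "object"}')
```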
GetFeatureEngineeringKafkaConfigsKafkaConfigAuthConfig
- ucServiceCredentialName (string) - Name of the Unity Catalog service credential. This value will be set under the option databricks.serviceCredential.
GetFeatureEngineeringKafkaConfigsKafkaConfigBackfillSource
- deltaTableSource (GetFeatureEngineeringKafkaConfigsKafkaConfigBackfillSourceDeltaTableSource) - (DeltaTableSource) The Delta table source containing the historic data to backfill. Only the Delta table name is used for backfill; the entity columns and timeseries column are ignored, as they are defined by the associated KafkaSource.
GetFeatureEngineeringKafkaConfigsKafkaConfigBackfillSourceDeltaTableSource
- entityColumns (list of string) - The entity columns of the Delta table.
- fullName (string) - The full three-part (catalog, schema, table) name of the Delta table.
- timeseriesColumn (string) - The timeseries column of the Delta table.
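A three-part fullName can be split into its catalog, schema, and table components with plain string handling. This is a sketch; split_full_name and the table name shown are hypothetical, not part of the SDK:

```python
def split_full_name(full_name: str):
    """Split a full three-part Delta table name into (catalog, schema, table)."""
    parts = full_name.split(".")
    if len(parts) != 3:
        raise ValueError(f"expected catalog.schema.table, got {full_name!r}")
    return tuple(parts)

catalog, schema, table = split_full_name("main.features.user_clicks")
```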
GetFeatureEngineeringKafkaConfigsKafkaConfigKeySchema
- jsonSchema (string) - Schema of the JSON object in standard IETF JSON Schema format (https://json-schema.org/).
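A jsonSchema value is an ordinary JSON Schema document serialized as a string. For example (the field names here are illustrative, not from this reference):

```python
import json

# A JSON Schema document, serialized to a string as the jsonSchema field expects.
key_schema = json.dumps({
    "type": "object",
    "properties": {"user_id": {"type": "string"}},
    "required": ["user_id"],
})

# The string round-trips back to a dict for inspection.
parsed = json.loads(key_schema)
```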
GetFeatureEngineeringKafkaConfigsKafkaConfigProviderConfig
- workspaceId (string) - The ID of the workspace the resource belongs to. The workspace must be part of the account the provider is configured with.
GetFeatureEngineeringKafkaConfigsKafkaConfigSubscriptionMode
- assign (string) - A JSON string that contains the specific topic-partitions to consume from. For example, with '{"topicA":[0,1],"topicB":[2,4]}', topicA's 0th and 1st partitions will be consumed from.
- subscribe (string) - A comma-separated list of Kafka topics to read from. For example, 'topicA,topicB,topicC'.
- subscribePattern (string) - A regular expression matching topics to subscribe to. For example, 'topic.*' subscribes to all topics starting with 'topic'.
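The assign option carries its topic-partition map as a JSON string, so it can be decoded with the standard json module. A sketch using the example value from above:

```python
import json

# The example topic-partition assignment from the assign option.
assign = '{"topicA":[0,1],"topicB":[2,4]}'
partitions = json.loads(assign)

# topicA's partitions 0 and 1 are consumed from.
topic_a_partitions = partitions["topicA"]
```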
GetFeatureEngineeringKafkaConfigsKafkaConfigValueSchema
- jsonSchema (string) - Schema of the JSON object in standard IETF JSON Schema format (https://json-schema.org/).
GetFeatureEngineeringKafkaConfigsProviderConfig
- workspaceId (string) - The ID of the workspace the resource belongs to. The workspace must be part of the account the provider is configured with.
Package Details
- Repository: databricks pulumi/pulumi-databricks
- License: Apache-2.0
- Notes: This Pulumi package is based on the databricks Terraform Provider.
