getFeatureEngineeringKafkaConfigs
Databricks v1.87.0 published on Friday, Feb 20, 2026 by Pulumi

    > Note: This data source is in Private Preview.

    Using getFeatureEngineeringKafkaConfigs

    Two invocation forms are available. The direct form accepts plain arguments and either blocks until the result value is available, or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.

    function getFeatureEngineeringKafkaConfigs(args: GetFeatureEngineeringKafkaConfigsArgs, opts?: InvokeOptions): Promise<GetFeatureEngineeringKafkaConfigsResult>
    function getFeatureEngineeringKafkaConfigsOutput(args: GetFeatureEngineeringKafkaConfigsOutputArgs, opts?: InvokeOptions): Output<GetFeatureEngineeringKafkaConfigsResult>
    def get_feature_engineering_kafka_configs(page_size: Optional[int] = None,
                                              provider_config: Optional[GetFeatureEngineeringKafkaConfigsProviderConfig] = None,
                                              opts: Optional[InvokeOptions] = None) -> GetFeatureEngineeringKafkaConfigsResult
    def get_feature_engineering_kafka_configs_output(page_size: Optional[pulumi.Input[int]] = None,
                                                     provider_config: Optional[pulumi.Input[GetFeatureEngineeringKafkaConfigsProviderConfigArgs]] = None,
                                                     opts: Optional[InvokeOptions] = None) -> Output[GetFeatureEngineeringKafkaConfigsResult]
    func GetFeatureEngineeringKafkaConfigs(ctx *Context, args *GetFeatureEngineeringKafkaConfigsArgs, opts ...InvokeOption) (*GetFeatureEngineeringKafkaConfigsResult, error)
    func GetFeatureEngineeringKafkaConfigsOutput(ctx *Context, args *GetFeatureEngineeringKafkaConfigsOutputArgs, opts ...InvokeOption) GetFeatureEngineeringKafkaConfigsResultOutput

    > Note: This function is named GetFeatureEngineeringKafkaConfigs in the Go SDK.

    public static class GetFeatureEngineeringKafkaConfigs 
    {
        public static Task<GetFeatureEngineeringKafkaConfigsResult> InvokeAsync(GetFeatureEngineeringKafkaConfigsArgs args, InvokeOptions? opts = null)
        public static Output<GetFeatureEngineeringKafkaConfigsResult> Invoke(GetFeatureEngineeringKafkaConfigsInvokeArgs args, InvokeOptions? opts = null)
    }
    public static CompletableFuture<GetFeatureEngineeringKafkaConfigsResult> getFeatureEngineeringKafkaConfigs(GetFeatureEngineeringKafkaConfigsArgs args, InvokeOptions options)
    public static Output<GetFeatureEngineeringKafkaConfigsResult> getFeatureEngineeringKafkaConfigs(GetFeatureEngineeringKafkaConfigsArgs args, InvokeOptions options)
    
    fn::invoke:
      function: databricks:index/getFeatureEngineeringKafkaConfigs:getFeatureEngineeringKafkaConfigs
      arguments:
        # arguments dictionary
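In a Pulumi YAML program, the invoke form above might be filled in as follows (the variable name and page size are illustrative):

```yaml
variables:
  kafkaConfigs:
    fn::invoke:
      function: databricks:index/getFeatureEngineeringKafkaConfigs:getFeatureEngineeringKafkaConfigs
      arguments:
        pageSize: 100
```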

    The following arguments are supported. Property names are shown in camelCase; each SDK follows its own convention (PascalCase in C# and Go, snake_case in Python):

    pageSize int
    The maximum number of results to return.
    providerConfig GetFeatureEngineeringKafkaConfigsProviderConfig
    Configure the provider for management through the account-level provider.

    getFeatureEngineeringKafkaConfigs Result

    The following output properties are available:

    id string
    The provider-assigned unique ID for this managed resource.
    kafkaConfigs list of GetFeatureEngineeringKafkaConfigsKafkaConfig
    The list of Kafka configs returned by the query.
    pageSize int
    The maximum number of results to return.
    providerConfig GetFeatureEngineeringKafkaConfigsProviderConfig
    Configure the provider for management through the account-level provider.
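Once resolved, kafkaConfigs is an ordinary list and can be post-processed in plain code. A minimal Python sketch, using dicts to stand in for the result objects (the sample data is hypothetical):

```python
# Hypothetical resolved result of getFeatureEngineeringKafkaConfigs;
# each entry mirrors the KafkaConfig supporting type documented below.
kafka_configs = [
    {"name": "clicks-stream", "bootstrap_servers": "b1:9092,b2:9092"},
    {"name": "orders-stream", "bootstrap_servers": "b3:9092"},
]

def find_config(configs, name):
    """Return the first Kafka config whose unique name matches, or None."""
    return next((c for c in configs if c["name"] == name), None)

cfg = find_config(kafka_configs, "orders-stream")
```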

    Supporting Types

    GetFeatureEngineeringKafkaConfigsKafkaConfig

    authConfig GetFeatureEngineeringKafkaConfigsKafkaConfigAuthConfig
    (AuthConfig) - Authentication configuration for connecting to topics.
    backfillSource GetFeatureEngineeringKafkaConfigsKafkaConfigBackfillSource
    (BackfillSource) - A user-provided and user-managed source for backfilling data. Historical data is used when creating a training set from streaming features linked to this Kafka config. In the future, a separate table will be maintained by Databricks for forward-filling data. The schema of this source must exactly match the key and value schemas specified for this Kafka config.
    bootstrapServers string
    (string) - A comma-separated list of host/port pairs pointing to the Kafka cluster.
    extraOptions map of string to string
    (object) - Catch-all for miscellaneous options. Keys should be source options or Kafka consumer options (kafka.*).
    keySchema GetFeatureEngineeringKafkaConfigsKafkaConfigKeySchema
    (SchemaConfig) - Schema configuration for extracting message keys from topics. At least one of key_schema and value_schema must be provided.
    name string
    (string) - Name that uniquely identifies this Kafka config within the metastore. This is the identifier used by a Feature object to reference these configs, and it can be distinct from the topic name.
    subscriptionMode GetFeatureEngineeringKafkaConfigsKafkaConfigSubscriptionMode
    (SubscriptionMode) - Options to configure which Kafka topics to pull data from.
    valueSchema GetFeatureEngineeringKafkaConfigsKafkaConfigValueSchema
    (SchemaConfig) - Schema configuration for extracting message values from topics. At least one of key_schema and value_schema must be provided.
    providerConfig GetFeatureEngineeringKafkaConfigsKafkaConfigProviderConfig
    Configure the provider for management through the account-level provider.
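The extraOptions contract above (keys are either source options or kafka.*-prefixed consumer options) and the comma-separated bootstrapServers format can be sanity-checked before a config is submitted. A hypothetical helper; the option keys shown are illustrative:

```python
def split_bootstrap_servers(servers: str) -> list[str]:
    """Split a comma-separated host/port list into individual host:port pairs."""
    return [s.strip() for s in servers.split(",") if s.strip()]

def consumer_options(extra_options: dict[str, str]) -> dict[str, str]:
    """Return only the Kafka consumer options (keys prefixed with 'kafka.')."""
    return {k: v for k, v in extra_options.items() if k.startswith("kafka.")}

opts = {
    "kafka.session.timeout.ms": "30000",  # passed through to the Kafka consumer
    "startingOffsets": "earliest",        # a source option, not a consumer option
}
```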

    GetFeatureEngineeringKafkaConfigsKafkaConfigAuthConfig

    ucServiceCredentialName string
    (string) - Name of the Unity Catalog service credential. This value will be set under the option databricks.serviceCredential.
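Per the description above, the credential name ends up under the option key databricks.serviceCredential. A hypothetical helper that builds such an options map makes the mapping concrete:

```python
def auth_options(uc_service_credential_name: str) -> dict[str, str]:
    """Map a Unity Catalog service credential name to its consumer option key."""
    return {"databricks.serviceCredential": uc_service_credential_name}
```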

    GetFeatureEngineeringKafkaConfigsKafkaConfigBackfillSource

    deltaTableSource GetFeatureEngineeringKafkaConfigsKafkaConfigBackfillSourceDeltaTableSource
    (DeltaTableSource) - The Delta table source containing the historical data to backfill. Only the Delta table name is used for backfill; the entity columns and timeseries column are ignored, as they are defined by the associated KafkaSource.

    GetFeatureEngineeringKafkaConfigsKafkaConfigBackfillSourceDeltaTableSource

    entityColumns list of string
    (list of string) - The entity columns of the Delta table.
    fullName string
    (string) - The full three-part (catalog, schema, table) name of the Delta table.
    timeseriesColumn string
    (string) - The timeseries column of the Delta table.
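fullName must be a three-part Unity Catalog identifier. A small hypothetical parser makes the expected shape concrete (the sample name is illustrative):

```python
def parse_full_name(full_name: str) -> tuple[str, str, str]:
    """Split a three-part 'catalog.schema.table' name into its components."""
    parts = full_name.split(".")
    if len(parts) != 3:
        raise ValueError(f"expected catalog.schema.table, got {full_name!r}")
    catalog, schema, table = parts
    return catalog, schema, table
```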

    GetFeatureEngineeringKafkaConfigsKafkaConfigKeySchema

    jsonSchema string
    (string) - Schema of the JSON object in standard IETF JSON Schema format (https://json-schema.org/).
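jsonSchema is an ordinary JSON Schema document serialized to a string. For example, a key consisting of a single string field might be declared as follows (the field name is illustrative):

```python
import json

# A minimal JSON Schema for a message key with one required string field.
key_schema = json.dumps({
    "type": "object",
    "properties": {"user_id": {"type": "string"}},
    "required": ["user_id"],
})

# Round-trips as standard JSON.
parsed = json.loads(key_schema)
```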

    GetFeatureEngineeringKafkaConfigsKafkaConfigProviderConfig

    workspaceId string
    Workspace ID to which the resource belongs. The workspace must be part of the account that the provider is configured with.

    GetFeatureEngineeringKafkaConfigsKafkaConfigSubscriptionMode

    assign string
    (string) - A JSON string specifying the topic-partitions to consume from. For example, with '{"topicA":[0,1],"topicB":[2,4]}', topicA's 0th and 1st partitions will be consumed from.
    subscribe string
    (string) - A comma-separated list of Kafka topics to read from. For example, 'topicA,topicB,topicC'.
    subscribePattern string
    (string) - A regular expression matching topics to subscribe to. For example, 'topic.*' subscribes to all topics starting with 'topic'.
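The three subscription modes are alternative ways of naming topics; their string formats can be illustrated in plain Python (topic names are hypothetical):

```python
import json
import re

# assign: a JSON map of topic name to list of partitions.
assign = '{"topicA":[0,1],"topicB":[2,4]}'
partitions = json.loads(assign)

# subscribe: a comma-separated topic list.
subscribe = "topicA,topicB,topicC"
topics = subscribe.split(",")

# subscribePattern: a regular expression over topic names.
subscribe_pattern = re.compile("topic.*")
```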

    GetFeatureEngineeringKafkaConfigsKafkaConfigValueSchema

    jsonSchema string
    (string) - Schema of the JSON object in standard IETF JSON Schema format (https://json-schema.org/).

    GetFeatureEngineeringKafkaConfigsProviderConfig

    workspaceId string
    Workspace ID to which the resource belongs. The workspace must be part of the account that the provider is configured with.

    Package Details

    Repository
    databricks pulumi/pulumi-databricks
    License
    Apache-2.0
    Notes
    This Pulumi package is based on the databricks Terraform Provider.