XTRConfig

class lightning_ir.models.xtr.config.XTRConfig(token_retrieval_k: int | None = None, fill_strategy: Literal['zero', 'min'] = 'zero', normalization: Literal['Z'] | None = 'Z', **kwargs)

Bases: ColConfig

__init__(token_retrieval_k: int | None = None, fill_strategy: Literal['zero', 'min'] = 'zero', normalization: Literal['Z'] | None = 'Z', **kwargs) → None
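
The constructor arguments reflect XTR's token-retrieval scoring: token_retrieval_k is the number of document tokens retrieved per query token, and fill_strategy controls how similarities of tokens that the token retrieval did not return are imputed, either with zero or with the minimum retrieved similarity. A minimal pure-Python sketch of the imputation step (function and variable names are illustrative, not part of the library):

```python
from typing import List

def impute_missing(scores: List[float], retrieved: List[bool],
                   fill_strategy: str = "zero") -> List[float]:
    """Impute similarities for tokens that token retrieval did not return."""
    if fill_strategy == "zero":
        fill = 0.0
    elif fill_strategy == "min":
        # use the smallest similarity among the tokens that were retrieved
        fill = min(s for s, kept in zip(scores, retrieved) if kept)
    else:
        raise ValueError(f"unknown fill_strategy: {fill_strategy!r}")
    return [s if kept else fill for s, kept in zip(scores, retrieved)]
```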

Methods

__init__([token_retrieval_k, fill_strategy, ...])

from_dict(config_dict, *args, **kwargs)

Loads the configuration from a dictionary.

from_pretrained(...)

Loads the configuration from a pretrained model name or path.

get_config_dict(...)

Resolves the configuration dictionary used to instantiate the configuration.

save_pretrained(save_directory[, push_to_hub])

Saves the configuration to a directory, optionally pushing it to the Hugging Face Hub.

to_added_args_dict()

Outputs a dictionary of the added arguments.

to_dict()

Serializes the configuration to a dictionary.

to_tokenizer_dict()

Outputs a dictionary of the tokenizer arguments.

Attributes

ADDED_ARGS

Arguments added to the configuration.

TOKENIZER_ARGS

Arguments for the tokenizer.

backbone_model_type

Backbone model type for the configuration.

model_type

Model type for the configuration.

classmethod from_dict(config_dict: Dict[str, Any], *args, **kwargs) → LightningIRConfig

Loads the configuration from a dictionary. Wraps the transformers.PretrainedConfig.from_dict method to return a derived LightningIRConfig class. See LightningIRConfigClassFactory for more details.

Parameters:

config_dict (Dict[str, Any]) – Configuration dictionary

Raises:

ValueError – If the model type in the configuration dictionary does not match the model type of this configuration class

Returns:

Derived LightningIRConfig class

Return type:

LightningIRConfig
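
The round trip and mismatch check described above can be sketched as follows (a simplified stand-in, not the library's actual implementation):

```python
class ConfigSketch:
    """Hypothetical sketch of a config with a to_dict/from_dict round trip."""

    model_type = "xtr"

    def __init__(self, token_retrieval_k=None):
        self.token_retrieval_k = token_retrieval_k

    def to_dict(self):
        # serialize the configuration, recording its model type
        return {"model_type": self.model_type,
                "token_retrieval_k": self.token_retrieval_k}

    @classmethod
    def from_dict(cls, config_dict):
        # reject dictionaries produced by a different configuration class
        if config_dict.get("model_type") != cls.model_type:
            raise ValueError("model type mismatch")
        return cls(token_retrieval_k=config_dict.get("token_retrieval_k"))
```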

to_added_args_dict() → Dict[str, Any]

Outputs a dictionary of the added arguments.

Returns:

Added arguments

Return type:

Dict[str, Any]

to_tokenizer_dict() → Dict[str, Any]

Outputs a dictionary of the tokenizer arguments.

Returns:

Tokenizer arguments

Return type:

Dict[str, Any]
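
Taken together with the TOKENIZER_ARGS attribute above, to_tokenizer_dict plausibly amounts to filtering the configuration's entries down to the tokenizer-relevant names. A hypothetical sketch (the argument names here are illustrative, not taken from the library):

```python
# Illustrative stand-in for the TOKENIZER_ARGS attribute
TOKENIZER_ARGS = {"query_length", "doc_length"}

def to_tokenizer_dict(config: dict) -> dict:
    """Keep only the entries whose keys are tokenizer arguments."""
    return {k: v for k, v in config.items() if k in TOKENIZER_ARGS}
```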