# allennlp.common.params

The Params class represents a dictionary of parameters (e.g. for configuring a model), with added functionality around logging and validation.

class allennlp.common.params.Params(params: Dict[str, Any], history: str = '', loading_from_archive: bool = False, files_to_archive: Dict[str, str] = None)[source]

Bases: collections.abc.MutableMapping

Represents a parameter dictionary with a history, and contains other functionality around parameter passing and validation for AllenNLP.

There are currently two benefits of a Params object over a plain dictionary for parameter passing:

1. We handle a few kinds of parameter validation, including making sure that parameters representing discrete choices actually have acceptable values, and making sure no extra parameters are passed.
2. We log all parameter reads, including default values. This gives a more complete specification of the actual parameters used than is given in a JSON file, because those may not specify what default values were used, whereas this will log them.

The convention for using a Params object in AllenNLP is that you will consume the parameters as you read them, so that there are none left when you’ve read everything you expect. This lets us easily validate that you didn’t pass in any extra parameters, just by making sure that the parameter dictionary is empty. You should do this when you’re done handling parameters, by calling Params.assert_empty().
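This convention can be illustrated with a toy stand-in (not the real Params class: a plain dict's pop stands in for Params.pop, and the final emptiness check stands in for Params.assert_empty(); build_model and its parameters are hypothetical):

```python
# Toy illustration of the consume-then-validate convention.
def build_model(params: dict) -> dict:
    hidden_size = params.pop("hidden_size", 100)  # each read consumes a key
    dropout = params.pop("dropout", 0.0)
    if params:  # anything left over was an unexpected parameter
        raise ValueError(f"Extra parameters passed to Model: {list(params)}")
    return {"hidden_size": hidden_size, "dropout": dropout}
```

Misspelled or extra keys are caught immediately rather than being silently ignored.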

DEFAULT = <object object>
add_file_to_archive(name: str) → None[source]

Any class in its from_params method can request that some of its input files be added to the archive by calling this method.

For example, if some class A had an input_file parameter, it could call

`params.add_file_to_archive("input_file")`

which would store the supplied value for input_file at the key previous.history.and.then.input_file. The files_to_archive dict is shared with child instances via the _check_is_dict method, so that the final mapping can be retrieved from the top-level Params object.

NOTE: You must call add_file_to_archive before you pop() the parameter, because the Params instance looks up the value of the filename inside itself.

If the loading_from_archive flag is True, this will be a no-op.

as_dict(quiet: bool = False, infer_type_and_cast: bool = False)[source]

Sometimes we need to just represent the parameters as a dict, for instance when we pass them to PyTorch code.

Parameters:

- quiet : bool, optional (default = False) — Whether to log the parameters before returning them as a dict.
- infer_type_and_cast : bool, optional (default = False) — If True, we infer types and cast (e.g. strings that look like floats are cast to floats).

as_flat_dict()[source]

Returns the parameters as a flat dictionary, mapping compound keys to values. Nested structure is collapsed with periods.
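A minimal sketch of this flattening (an illustrative reimplementation, not the library code):

```python
def flatten_params(params: dict, prefix: str = "") -> dict:
    """Collapse nested dicts into a flat dict with period-joined keys."""
    flat = {}
    for key, value in params.items():
        full_key = prefix + key
        if isinstance(value, dict):
            flat.update(flatten_params(value, full_key + "."))
        else:
            flat[full_key] = value
    return flat
```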

as_ordered_dict(preference_orders: List[List[str]] = None) → collections.OrderedDict[source]

Returns an OrderedDict of the parameters, ordered according to a list of partial preference orders.

Parameters:

- preference_orders : List[List[str]], optional — A list of partial preference orders. ["A", "B", "C"] means "A" > "B" > "C". When multiple preference orders are given, the first takes precedence. Keys not found in any preference order sort last, alphabetically. Default preference orders: [["dataset_reader", "iterator", "model", "train_data_path", "validation_data_path", "test_data_path", "trainer", "vocabulary"], ["type"]]

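The key ordering can be sketched as follows (order_keys is a hypothetical helper illustrating the sort, not the library implementation):

```python
def order_keys(keys, preference_orders):
    """Sort keys by their position in each partial order; earlier orders
    take precedence, and keys absent from every order sort last,
    alphabetically (they get rank == len(order) for each order)."""
    def rank(key):
        ranks = tuple(order.index(key) if key in order else len(order)
                      for order in preference_orders)
        return ranks + (key,)  # final tiebreak: alphabetical
    return sorted(keys, key=rank)
```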
assert_empty(class_name: str)[source]

Raises a ConfigurationError if self.params is not empty. We take class_name as an argument so that the error message gives some idea of where an error happened, if there was one. class_name should be the name of the calling class, the one that got extra parameters (if there are any).

duplicate() → allennlp.common.params.Params[source]

Uses copy.deepcopy() to create a duplicate (but fully distinct) copy of these Params.

static from_file(params_file: str, params_overrides: str = '', ext_vars: dict = None) → allennlp.common.params.Params[source]

Load a Params object from a configuration file.

Parameters:

- params_file : str — The path to the configuration file to load.
- params_overrides : str, optional — A serialized dict of overrides applied to the final object, e.g. {"model.embedding_dim": 10}.
- ext_vars : dict, optional — Our config files are Jsonnet, which allows specifying external variables for later substitution. Typically we substitute these using environment variables; however, you can also specify them here, in which case they take priority over environment variables, e.g. {"HOME_DIR": "/Users/allennlp/home"}.

get(key: str, default: Any = <object object>)[source]

Performs the functionality associated with dict.get(key) but also checks for returned dicts and returns a Params object in their place with an updated history.

pop(key: str, default: Any = <object object>) → Any[source]

Performs the functionality associated with dict.pop(key), along with checking for returned dictionaries, replacing them with Params objects with an updated history.

If key is not present in the dictionary, and no default was specified, we raise a ConfigurationError, instead of the typical KeyError.

pop_bool(key: str, default: Any = <object object>) → bool[source]

Performs a pop and coerces to a bool.

pop_choice(key: str, choices: List[Any], default_to_first_choice: bool = False) → Any[source]

Gets the value of key in the params dictionary, ensuring that the value is one of the given choices. Note that this pops the key from params, modifying the dictionary, consistent with how parameters are processed in this codebase.

Parameters:

- key : str — The key whose value to retrieve from the params dictionary.
- choices : List[Any] — A list of valid options for the value corresponding to key. For example, if you're specifying the type of encoder to use for some part of your model, the choices might be the list of encoder classes we know about and can instantiate. If the value we find in the params dictionary is not in choices, we raise a ConfigurationError, because the user specified an invalid value in their parameter file.
- default_to_first_choice : bool, optional (default = False) — If True, we allow the key to be absent from the parameter dictionary, in which case we return the first item in choices. If False, we raise a ConfigurationError when the key is missing, because specifying it is required (e.g., you have to specify your model class when running an experiment, but you can use default settings for encoders if you want).

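The semantics can be sketched with plain dicts (an illustration, not the library code; KeyError and ValueError stand in for ConfigurationError):

```python
_NO_DEFAULT = object()  # sentinel: distinguishes "missing" from a None value

def pop_choice_sketch(params, key, choices, default_to_first_choice=False):
    """Pop `key` from `params`, requiring its value to be one of `choices`."""
    default = choices[0] if default_to_first_choice else _NO_DEFAULT
    value = params.pop(key, default)
    if value is _NO_DEFAULT:
        raise KeyError(f"'{key}' is required but was not given")
    if value not in choices:
        raise ValueError(f"{value!r} not in valid choices {choices}")
    return value
```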
pop_float(key: str, default: Any = <object object>) → float[source]

Performs a pop and coerces to a float.

pop_int(key: str, default: Any = <object object>) → int[source]

Performs a pop and coerces to an int.

to_file(params_file: str, preference_orders: List[List[str]] = None) → None[source]

Writes the parameters to the given file.

allennlp.common.params.infer_and_cast(value: Any)[source]

In some cases we’ll be feeding params dicts to functions we don’t own; for example, PyTorch optimizers. In that case we can’t use pop_int or similar to force casts (which means you can’t specify int parameters using environment variables). This function takes something that looks JSON-like and recursively casts things that look like (bool, int, float) to (bool, int, float).
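A sketch of the recursive casting (an illustrative reimplementation of the idea, not the library code):

```python
def infer_and_cast_sketch(value):
    """Recursively cast strings that look like bools, ints, or floats."""
    if isinstance(value, dict):
        return {k: infer_and_cast_sketch(v) for k, v in value.items()}
    if isinstance(value, list):
        return [infer_and_cast_sketch(v) for v in value]
    if isinstance(value, str):
        if value.lower() == "true":
            return True
        if value.lower() == "false":
            return False
        try:
            return int(value)
        except ValueError:
            pass
        try:
            return float(value)
        except ValueError:
            pass
    return value  # anything else passes through unchanged
```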

allennlp.common.params.parse_overrides(serialized_overrides: str) → Dict[str, Any][source]

Parses a serialized string of overrides (such as one passed on the command line) into a dictionary.

allennlp.common.params.pop_choice(params: Dict[str, Any], key: str, choices: List[Any], default_to_first_choice: bool = False, history: str = '?.') → Any[source]

Performs the same function as Params.pop_choice(), but is required in order to deal with places that the Params object is not welcome, such as inside Keras layers. See the docstring of that method for more detail on how this function works.

This method adds a history parameter, on the off chance that you know it, so that we can reproduce Params.pop_choice() exactly. We default to using "?." if you don't know the history, so you'll have to fix that in the log if you want to actually recover the logged parameters.

allennlp.common.params.unflatten(flat_dict: Dict[str, Any]) → Dict[str, Any][source]

Given a "flattened" dict with compound keys, e.g. {"a.b": 0}, unflatten it into {"a": {"b": 0}}.
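A minimal sketch of the unflattening (illustrative, not the library code):

```python
def unflatten_sketch(flat_dict):
    """Rebuild nested dicts from period-joined compound keys."""
    result = {}
    for compound_key, value in flat_dict.items():
        *parents, leaf = compound_key.split(".")
        current = result
        for part in parents:
            current = current.setdefault(part, {})  # create nesting as needed
        current[leaf] = value
    return result
```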
allennlp.common.params.with_fallback(preferred: Dict[str, Any], fallback: Dict[str, Any]) → Dict[str, Any][source]

Deep merge two dicts, preferring values from preferred.
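The merge can be sketched as follows (an illustrative reimplementation, not the library code):

```python
def with_fallback_sketch(preferred, fallback):
    """Deep-merge two dicts: keys in `preferred` win, and nested dicts
    present in both sides are merged recursively."""
    merged = dict(fallback)
    for key, value in preferred.items():
        if isinstance(value, dict) and isinstance(fallback.get(key), dict):
            merged[key] = with_fallback_sketch(value, fallback[key])
        else:
            merged[key] = value
    return merged
```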