Helper functions for archiving models and restoring archived models.

class allennlp.models.archival.Archive(model, config)[source]

Bases: tuple


config

Alias for field number 1


model

Alias for field number 0
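Since Archive is a tuple subclass with named fields, its contents can be read either by name or by position. A minimal sketch of that structure, using plain placeholders instead of the real Model and Params objects:

```python
from typing import NamedTuple

# A sketch of the Archive structure. In the library, `model` holds the
# trained Model and `config` holds the training Params; placeholders here.
class Archive(NamedTuple):
    model: object   # field number 0: the trained model
    config: object  # field number 1: the training configuration

archive = Archive(model="a-model", config={"trainer": {}})

# Fields are accessible by name or by tuple index.
assert archive.model == archive[0]
assert archive.config == archive[1]
```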

allennlp.models.archival.archive_model(serialization_dir: str, weights: str = _DEFAULT_WEIGHTS, files_to_archive: typing.Dict[str, str] = None) → None[source]

Archive the model weights, its training configuration, and its vocabulary to model.tar.gz. Include the additional files_to_archive if provided.

serialization_dir: ``str``

The directory where the weights and vocabulary are written out.

weights: ``str``, optional (default=_DEFAULT_WEIGHTS)

Which weights file to include in the archive. The default is ``_DEFAULT_WEIGHTS``.

files_to_archive: ``Dict[str, str]``, optional (default=None)

A mapping {flattened_key -> filename} of supplementary files to include in the archive. That is, if you wanted to include params['model']['weights'] then you would specify the key as ``"model.weights"``.
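Conceptually, archiving bundles the training configuration, the chosen weights file, and the vocabulary directory into a single model.tar.gz. A simplified, hypothetical stand-in using only the standard library (the internal archive member names such as "weights.th" are assumptions for illustration, not the library's guaranteed layout):

```python
import os
import tarfile

def archive_model_sketch(serialization_dir: str, weights: str = "best.th") -> str:
    """Simplified stand-in for archive_model: bundle the training config,
    the chosen weights file, and the vocabulary directory into
    model.tar.gz inside serialization_dir."""
    archive_file = os.path.join(serialization_dir, "model.tar.gz")
    with tarfile.open(archive_file, "w:gz") as archive:
        # Member names inside the tarball are illustrative assumptions.
        archive.add(os.path.join(serialization_dir, "config.json"), arcname="config.json")
        archive.add(os.path.join(serialization_dir, weights), arcname="weights.th")
        archive.add(os.path.join(serialization_dir, "vocabulary"), arcname="vocabulary")
    return archive_file
```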

allennlp.models.archival.load_archive(archive_file: str, cuda_device: int = -1, overrides: str = '', weights_file: str = None) → allennlp.models.archival.Archive[source]

Instantiates an Archive from an archived tar.gz file.

archive_file: ``str``

The archive file to load the model from.

weights_file: ``str``, optional (default = None)

The weights file to use. If unspecified, the weights file bundled inside the ``archive_file`` will be used.

cuda_device: ``int``, optional (default = -1)

If cuda_device is >= 0, the model will be loaded onto the corresponding GPU. Otherwise it will be loaded onto the CPU.

overrides: ``str``, optional (default = "")

JSON overrides to apply to the unarchived Params object.
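The config-loading half of this can be sketched with the standard library: extract the tarball, read the training configuration, and apply the JSON overrides. This is a hypothetical simplification (it applies overrides as a shallow merge, whereas the real function works on a Params object and also restores the Model from the archived weights and vocabulary):

```python
import json
import os
import tarfile
import tempfile

def load_config_sketch(archive_file: str, overrides: str = "") -> dict:
    """Simplified stand-in for the config-loading part of load_archive:
    extract the tar.gz and return the training configuration with JSON
    overrides applied (shallowly, for illustration)."""
    tempdir = tempfile.mkdtemp()
    with tarfile.open(archive_file, "r:gz") as archive:
        archive.extractall(tempdir)
    with open(os.path.join(tempdir, "config.json")) as f:
        config = json.load(f)
    if overrides:
        config.update(json.loads(overrides))
    return config
```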