allennlp.data.dataset_readers.dataset_reader

class allennlp.data.dataset_readers.dataset_reader.DatasetReader(lazy: bool = False)

Bases: allennlp.common.registrable.Registrable

A DatasetReader knows how to turn a file containing a dataset into a collection of Instances. To implement your own, just override the _read(file_path) method to return an Iterable of the instances. This could be a list containing the instances or a lazy generator that yields them one at a time.

All parameters necessary to _read the data, apart from the file path, should be passed to the constructor of the DatasetReader.

Parameters:
lazy : bool, optional (default=False)

If this is true, read() will return an object whose __iter__ method reloads the dataset each time it’s called. Otherwise, read() returns a list.
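A minimal sketch of subclassing this contract, with Instance simplified to a dict and the reader name LineReader made up for illustration (the real classes live in allennlp.data):

```python
from typing import Iterable, List


class Instance(dict):
    """Simplified stand-in for allennlp.data.instance.Instance."""


class DatasetReader:
    """Stand-in base class mirroring the documented constructor."""

    def __init__(self, lazy: bool = False) -> None:
        self.lazy = lazy

    def _read(self, file_path: str) -> Iterable[Instance]:
        raise NotImplementedError


class LineReader(DatasetReader):
    """Hypothetical reader: one instance per non-empty line."""

    def _read(self, file_path: str) -> Iterable[Instance]:
        # A lazy generator works here, as does returning a list.
        with open(file_path) as f:
            for line in f:
                line = line.strip()
                if line:
                    yield Instance(text=line)
```

Because _read is written as a generator, the same subclass works in both lazy and non-lazy modes.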

read(file_path: str) → typing.Iterable[allennlp.data.instance.Instance]

Returns an Iterable containing all the instances in the specified dataset.

If self.lazy is False, this calls self._read(), ensures that the result is a list, then returns the resulting list.

If self.lazy is True, this returns an object whose __iter__ method calls self._read() each iteration. In this case your implementation of _read() must also be lazy (that is, not load all instances into memory at once), otherwise you will get a ConfigurationError.

In either case, the returned Iterable can be iterated over multiple times. It’s unlikely you’ll want to override this function, but if you do, your result should likewise be repeatedly iterable.
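The lazy-versus-eager contract above can be sketched as follows. This is an illustrative stand-alone version, not the library’s actual implementation; the _LazyInstances name and the free-standing read function are assumptions made for the example:

```python
from typing import Callable, Iterator, List


class _LazyInstances:
    """Iterable whose __iter__ re-invokes _read() on every iteration."""

    def __init__(self, generator_fn: Callable[[], Iterator]) -> None:
        self.generator_fn = generator_fn

    def __iter__(self) -> Iterator:
        return self.generator_fn()


def read(reader, file_path: str):
    if reader.lazy:
        # Each pass over the result re-runs reader._read(file_path),
        # so a lazy _read() is iterated repeatedly without caching.
        return _LazyInstances(lambda: iter(reader._read(file_path)))
    # Non-lazy: drain _read() into a list, which is trivially
    # repeatedly iterable.
    instances = reader._read(file_path)
    if not isinstance(instances, list):
        instances = list(instances)
    return instances
```

Note that both branches return something that can be iterated more than once: a list in the eager case, and an object that restarts the generator in the lazy case.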

text_to_instance(*inputs) → allennlp.data.instance.Instance

Does whatever tokenization or processing is necessary to go from textual input to an Instance. The primary intended use for this is with a Predictor, which gets text input as a JSON object and needs to process it to be input to a model.

The intent here is to share code between _read() and what happens at model serving time, or any other time you want to make a prediction from new data. We need to process the data in the same way it was done at training time. Allowing the DatasetReader to process new text lets us accomplish this, as we can just call DatasetReader.text_to_instance when serving predictions.

The input type here is rather vaguely specified, unfortunately. The Predictor will have to make some assumptions about the kind of DatasetReader that it’s using, in order to pass it the right information.
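A hedged sketch of the code-sharing pattern described above, with Instance simplified to a dict and the SentenceReader class invented for illustration (a real reader would build Fields and return an allennlp Instance):

```python
from typing import Iterable


class SentenceReader:
    """Hypothetical reader: lowercases and whitespace-tokenizes text."""

    def text_to_instance(self, text: str) -> dict:
        # All preprocessing lives here, so training and serving agree.
        return {"tokens": text.lower().split()}

    def _read(self, file_path: str) -> Iterable[dict]:
        with open(file_path) as f:
            for line in f:
                line = line.strip()
                if line:
                    # Training-time path reuses the same preprocessing a
                    # Predictor would invoke at serving time.
                    yield self.text_to_instance(line)
```

At serving time, a Predictor can call reader.text_to_instance(json_input["sentence"]) directly, guaranteeing that new text is processed exactly as the training data was.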