How to load a fine-tuned SciBERT NER model in AllenNLP?

I have trained the NER model on the SciIE dataset using the following config:

with_finetuning='_finetune'  # or '' for not fine-tuning

export BERT_VOCAB=/home/tomaz/neo4j/scibert/model/vocab.txt
export BERT_WEIGHTS=/home/tomaz/neo4j/scibert/model/weights.tar.gz

Then I run the training with the following command:

python -m train $CONFIG_FILE --include-package scibert -s "$@"

This worked nicely and I got a model.tar.gz as output. But when I try to load it with the AllenNLP library:

from allennlp.predictors.predictor import Predictor
predictor = Predictor.from_path("model.tar.gz")

I get the following error:

ConfigurationError: bert-pretrained not in acceptable choices for dataset_reader.token_indexers.bert.type: ['single_id', 'characters', 'elmo_characters', 'spacy', 'pretrained_transformer', 'pretrained_transformer_mismatched']. You should either use the --include-package flag to make sure the correct module is loaded, or use a fully qualified class name in your config file like {"model": "my_module.models.MyModel"} to have it imported automatically.

Any idea how to fix this? I'm using allennlp 1.2.1. I can't find any example of how to include a package when using AllenNLP as a Python library.

All you have to do is import the Python module that defines `bert-pretrained` at the top of your file. AllenNLP components register themselves by name as a side effect of their module being imported, so the programmatic equivalent of the `--include-package scibert` flag is to import the `scibert` package (or, in allennlp 1.x, call `allennlp.common.util.import_module_and_submodules("scibert")`) before calling `Predictor.from_path`.
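To see why the import matters, here is a minimal, self-contained sketch of the registry pattern AllenNLP uses (this is an illustrative toy, not AllenNLP's actual `Registrable` code): classes add themselves to a name-to-class table when their defining module runs, so a name from a module that was never imported is simply "not in acceptable choices".

```python
# Toy registry mimicking how AllenNLP resolves config strings like
# "bert-pretrained" to classes. Names appear in the registry only when
# the module defining them is imported -- hence the ConfigurationError.
_registry = {}

def register(name):
    """Decorator that records a class in the registry under `name`."""
    def decorator(cls):
        _registry[name] = cls
        return cls
    return decorator

def by_name(name):
    """Look up a registered class, failing like AllenNLP does when absent."""
    if name not in _registry:
        raise KeyError(f"{name} not in acceptable choices: {list(_registry)}")
    return _registry[name]

@register("single_id")
class SingleIdIndexer:
    pass

# Until the module containing this class has been imported, the name
# "bert-pretrained" does not exist in the registry at all.
@register("bert-pretrained")
class BertPretrainedIndexer:
    pass

print(by_name("bert-pretrained").__name__)  # BertPretrainedIndexer
```

Loading the archive triggers a lookup of every `type:` string in the stored config, which is why the import (or `--include-package`) has to happen before `Predictor.from_path`.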

Let me know if that worked!