Loading the pre-trained fine-grained NER model gives an error

import os
os.environ["KMP_DUPLICATE_LIB_OK"] = "True"  # work around duplicate OpenMP runtimes under Anaconda

from allennlp.predictors.predictor import Predictor
import allennlp_models.ner.crf_tagger  # registers the CrfTagger model class

predictor = Predictor.from_path(
    "https://storage.googleapis.com/allennlp-public-models/fine-grained-ner-model-elmo-2020.02.10.tar.gz"
)

gives the following error:
Traceback (most recent call last):
  File "allennlp_ner.py", line 7, in <module>
    predictor = Predictor.from_path("https://storage.googleapis.com/allennlp-public-models/fine-grained-ner-model-elmo-2020.02.10.tar.gz")
  File "/anaconda3/lib/python3.7/site-packages/allennlp/predictors/predictor.py", line 274, in from_path
    load_archive(archive_path, cuda_device=cuda_device),
  File "/anaconda3/lib/python3.7/site-packages/allennlp/models/archival.py", line 197, in load_archive
    opt_level=opt_level,
  File "/anaconda3/lib/python3.7/site-packages/allennlp/models/model.py", line 387, in load
    return model_class._load(config, serialization_dir, weights_file, cuda_device, opt_level)
  File "/anaconda3/lib/python3.7/site-packages/allennlp/models/model.py", line 332, in _load
    model.load_state_dict(model_state)
  File "/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 847, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for CrfTagger:
        Missing key(s) in state_dict: "encoder._module.forward_layer_0.cell.input_linearity.weight", "encoder._module.forward_layer_0.cell.state_linearity.weight", "encoder._module.forward_layer_0.cell.state_linearity.bias", "encoder._module.backward_layer_0.cell.input_linearity.weight", "encoder._module.backward_layer_0.cell.state_linearity.weight", "encoder._module.backward_layer_0.cell.state_linearity.bias", "encoder._module.forward_layer_1.cell.input_linearity.weight", "encoder._module.forward_layer_1.cell.state_linearity.weight", "encoder._module.forward_layer_1.cell.state_linearity.bias", "encoder._module.backward_layer_1.cell.input_linearity.weight", "encoder._module.backward_layer_1.cell.state_linearity.weight", "encoder._module.backward_layer_1.cell.state_linearity.bias".
        Unexpected key(s) in state_dict: "encoder._module.forward_layer_0.input_linearity.weight", "encoder._module.forward_layer_0.state_linearity.weight", "encoder._module.forward_layer_0.state_linearity.bias", "encoder._module.backward_layer_0.input_linearity.weight", "encoder._module.backward_layer_0.state_linearity.weight", "encoder._module.backward_layer_0.state_linearity.bias", "encoder._module.forward_layer_1.input_linearity.weight", "encoder._module.forward_layer_1.state_linearity.weight", "encoder._module.forward_layer_1.state_linearity.bias", "encoder._module.backward_layer_1.input_linearity.weight", "encoder._module.backward_layer_1.state_linearity.weight", "encoder._module.backward_layer_1.state_linearity.bias".
        size mismatch for text_field_embedder.token_embedder_token_characters._embedding._module.weight: copying a param with shape torch.Size([119, 25]) from checkpoint, the shape in current model is torch.Size([57007, 25]).
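
Note that every missing key differs from the corresponding unexpected key only by an extra ".cell." segment, and the character-embedding shapes disagree, which points to the archive having been produced by a different allennlp release than the one installed. As a minimal diagnostic sketch (the local archive path is assumed; allennlp archives store their parameters in weights.th), the raw checkpoint keys can be inspected directly:

import tarfile
import torch

# Path assumed: point this at the downloaded .tar.gz.
with tarfile.open("fine-grained-ner-model-elmo-2020.02.10.tar.gz") as archive:
    archive.extractall("ner_model")

# AllenNLP archives keep the model parameters in weights.th.
state = torch.load("ner_model/weights.th", map_location="cpu")
for key in sorted(state):
    if "linearity" in key:
        print(key, tuple(state[key].shape))

If the printed keys lack the ".cell." segment, the archive itself downloaded fine and the mismatch is coming from the installed library version.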


I have this issue as well!

It is possible no one from AllenAI monitors this forum.

I have submitted a bug report at
https://github.com/allenai/allennlp/issues/4276
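
In the meantime, since the key mismatch above is characteristic of loading an archive with a different allennlp release than the one that produced it, it is worth confirming which versions are actually installed. A minimal sketch, nothing here is specific to this model:

import allennlp
import torch

# If these do not match the versions the archive was trained with (see the
# linked issue for the exact pair), reinstall the matching allennlp and
# allennlp-models releases before loading the model again.
print("allennlp:", allennlp.__version__)
print("torch:", torch.__version__)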