Get model internals through beam search (CopyNet)

Hi,

I’m trying to visualize, in the demo, the attention weights of an existing model (copynet_seq2seq). Unfortunately, this information isn’t included in the outputs dictionary, so it seems like I have to copy the model (ironic, I know… ;)) and make some changes.

The relevant information is computed in _decoder_step() and take_search_step(), and it would have to travel all the way up to forward(), which returns the outputs dictionary. So the path is:
_decoder_step() -> take_search_step() -> _beam_search.search() -> _forward_beam_search() -> forward()
I thought of passing it as state["attentive_weights"] / state["selective_weights"], but _beam_search.search() doesn’t return the state. Is there a “clean” (minimal-changes) way to do this?
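
In case it helps anyone, here is roughly the workaround I have in mind. It’s a minimal sketch, assuming a subclass of the existing CopyNetSeq2Seq; the CopyNetWithInternals name and the _recorded_selective attribute are my own inventions, the import path may differ between allennlp versions, and the method signatures are paraphrased from the current source:

```python
from typing import Dict

import torch

# Import path varies by allennlp version; this is where it lives for me.
from allennlp.models.encoder_decoders.copynet_seq2seq import CopyNetSeq2Seq


class CopyNetWithInternals(CopyNetSeq2Seq):  # hypothetical subclass
    """Record per-step weights as a side channel on the instance, since
    BeamSearch.search() only returns (predictions, log_probs) and drops
    the state dict."""

    def _decoder_step(
        self,
        last_predictions: torch.Tensor,
        selective_weights: torch.Tensor,
        state: Dict[str, torch.Tensor],
    ) -> Dict[str, torch.Tensor]:
        # selective_weights is passed in, so we can record it without
        # touching the original method body. The attentive weights are
        # computed *inside* the parent method, so for those you still
        # have to copy the body and add a similar line next to where
        # they are computed (or use a forward hook, as suggested below).
        self._recorded_selective.append(selective_weights.detach().cpu())
        return super()._decoder_step(last_predictions, selective_weights, state)

    def _forward_beam_search(
        self, state: Dict[str, torch.Tensor]
    ) -> Dict[str, torch.Tensor]:
        # Reset the side channel, run the normal beam search, then attach
        # the recorded weights so forward() returns them in its dict.
        self._recorded_selective = []
        output_dict = super()._forward_beam_search(state)
        # One tensor per decoding step.
        output_dict["selective_weights"] = self._recorded_selective
        return output_dict
```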

And more generally, I think it would be great to be able to get any internal data from any (existing) model, maybe via some variable-naming convention? It is so important for interpretability.

Thanks!

Hi, this is a little complicated, but you should already be able to do this using PyTorch forward/backward hooks. We have also used this on our predictor modules for demo purposes; see here:
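
For reference, a minimal sketch of the hook approach (not our exact demo code): it assumes `model` is your loaded Model and `predictor` your Predictor, and that the attention is exposed as a submodule whose name you can discover with named_modules(); `captured`, make_hook(), and the "_attention" name are just illustrative:

```python
import torch

captured = {}  # name -> list of recorded tensors, one per forward call

def make_hook(name):
    def hook(module, inputs, output):
        # An allennlp Attention module returns the attention weights,
        # so the forward output is exactly what we want to record.
        captured.setdefault(name, []).append(output.detach().cpu())
    return hook

# First, list the submodule names to find the attention module.
for name, module in model.named_modules():
    print(name, type(module).__name__)

# Then attach a forward hook to it ("_attention" is a guess; use the
# name printed above for your model).
attention = dict(model.named_modules())["_attention"]
handle = attention.register_forward_hook(make_hook("attentive_weights"))

result = predictor.predict_json({"source": "some input sentence"})
handle.remove()  # always remove the hook when you're done
```

One caveat: hooks only fire on an nn.Module’s forward call, so values computed with plain tensor ops won’t show up this way.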

I’m not sure I totally understand how it works, but I’ll try it! Thanks!
(Meanwhile I implemented the changes from my first post. A bit ugly, but it works…)

Hey, I tried the hooks trick, but I still have some issues:

  1. The naming is so confusing that it’s almost impossible to find the right internals.
  2. In my case, I couldn’t find the “selective_weights” (I found only the attentive ones, and by value type rather than by name). I think they aren’t captured by the hooks because they aren’t the output of a module.
  3. The internals are so heavy that they aren’t consumable via a demo.

For 1 and 2, I guess some dotted-path naming convention like "module.submodule.variable" could be nice (I know this is really on torch… but maybe we can declare some best practice?).
For 3, I think predict_json() could accept an optional list of desired key paths to return (for example, "model_internals._decoder_step.attentive_weights"); see the toy sketch below.
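
To make the idea concrete, here’s a toy sketch of what I mean (select_paths() and the example dict are made up, not an existing API):

```python
def select_paths(outputs, paths):
    """Return only the requested dotted key paths from a nested dict."""
    def get(d, path):
        for part in path.split("."):
            d = d[part]
        return d
    return {p: get(outputs, p) for p in paths}

outputs = {
    "predictions": [[2, 5, 7]],
    "model_internals": {"_decoder_step": {"attentive_weights": [[0.1, 0.9]]}},
}
print(select_paths(outputs, ["model_internals._decoder_step.attentive_weights"]))
# {'model_internals._decoder_step.attentive_weights': [[0.1, 0.9]]}
```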

If there are existing solutions for any of these, I’d love to hear about them :)
Thanks a lot!