Alternative for maximum_samples_per_batch in AllenNLP 1.0+

In version 0.9.0 I could train classification models for all GLUE tasks with MBERT and the basic classifier on a 12 GB GPU (batch size 32). Since version 1.0, however, some tasks no longer fit in memory because I can no longer set maximum_samples_per_batch. Is there an alternative I can use in AllenNLP 1.0 or higher?
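
For reference, I mean the kind of 0.9-style bucket iterator block sketched below (the values are illustrative placeholders, not my exact config):

```jsonnet
"iterator": {
  "type": "bucket",
  "batch_size": 32,
  // Cap the total padded tokens per batch so long sequences don't blow up memory.
  "maximum_samples_per_batch": ["num_tokens", 4096]
}
```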

AllenNLP 1.0 has MaxTokensBatchSampler.

Does that do what you want?
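
Roughly, the 1.0-style equivalent would go under the data loader's batch sampler, something like the sketch below. The registered name and keys are from memory of the 1.x samplers API, so double-check them against your installed version; the max_tokens value is just an illustrative budget, and "tokens" assumes that is the name of your text field:

```jsonnet
"data_loader": {
  "batch_sampler": {
    "type": "max_tokens_sampler",
    // Caps the total number of padded tokens per batch, playing the role
    // the old ["num_tokens", N] limit did.
    "max_tokens": 4096,
    // Sort by the text field so similarly sized instances are batched
    // together and padding waste stays low.
    "sorting_keys": ["tokens"]
  }
}
```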


Thanks Dirk, that does indeed solve it! Somehow I overlooked it.