When using decomposable_attention to train a model, it throws an exception because there is no attribute in the config called WORD_EMBEDDING_DIM. I think you should change this to the following:
```python
nb_words = min(TrainConfig.MAX_NB_WORDS, len(word_index)) + 1
embedding_matrix = np.zeros((nb_words, TrainConfig.WORD_EMBEDDING_DIM))
for word, i in word_index.items():
    if word in word2vec.vocab:
        embedding_matrix[i] = word2vec.word_vec(word)
print('Null word embeddings: %d' % np.sum(np.sum(embedding_matrix, axis=1) == 0))
```
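For context, here is a minimal self-contained sketch of this embedding-matrix construction. It substitutes a plain dict of vectors for the gensim word2vec model and hypothetical config values for `TrainConfig`, so the shape logic and the null-row count can be checked in isolation:

```python
import numpy as np

# Hypothetical stand-ins for TrainConfig and the gensim word2vec model.
MAX_NB_WORDS = 10
WORD_EMBEDDING_DIM = 4
word_vectors = {  # plays the role of word2vec.vocab / word2vec.word_vec()
    'cat': np.ones(WORD_EMBEDDING_DIM),
    'dog': np.full(WORD_EMBEDDING_DIM, 2.0),
}
# Tokenizer-style 1-based word index; 'xyzzy' has no pretrained vector.
word_index = {'cat': 1, 'dog': 2, 'xyzzy': 3}

# +1 leaves row 0 free for the padding index, as in the issue's snippet.
nb_words = min(MAX_NB_WORDS, len(word_index)) + 1
embedding_matrix = np.zeros((nb_words, WORD_EMBEDDING_DIM))
for word, i in word_index.items():
    if word in word_vectors:  # skip out-of-vocabulary words
        embedding_matrix[i] = word_vectors[word]

# Rows that stayed all-zero are OOV words plus the padding row 0.
null_rows = int(np.sum(np.sum(embedding_matrix, axis=1) == 0))
print('Null word embeddings: %d' % null_rows)  # rows 0 and 3 -> 2
```

The null-embedding count is a quick sanity check: a large value usually means the tokenizer's vocabulary and the pretrained vectors don't overlap well.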