I am working on a classification model whose input has shape (batch, step, features).
To improve its accuracy, I intended to add a self-attention layer.
I am unfamiliar with how to use one in my case, since most examples online are concerned with NLP models using embeddings.
```python
def LSTM_attention_model(X_train, y_train, X_test, y_test, num_classes, loss,
                         batch_size=68, units=128, learning_rate=0.005,
                         epochs=20, dropout=0.2, recurrent_dropout=0.2,
                         optimizer='Adam'):

    class myCallback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs={}):
            if logs.get('acc') > 0.90:
                print("\nReached 90% accuracy so cancelling training!")
                self.model.stop_training = True

    callbacks = myCallback()

    model = tf.keras.models.Sequential()
    model.add(Masking(mask_value=0.0,
                      input_shape=(X_train.shape[1], X_train.shape[2])))
    model.add(Bidirectional(LSTM(units, dropout=dropout,
                                 recurrent_dropout=recurrent_dropout)))
    model.add(SeqSelfAttention(attention_activation='sigmoid'))
    model.add(Dense(num_classes, activation='softmax'))

    opt = opt_select(optimizer)
    model.compile(loss=loss, optimizer=opt, metrics=['accuracy'])

    history = model.fit(X_train, y_train,
                        batch_size=batch_size,
                        epochs=epochs,
                        validation_data=(X_test, y_test),
                        verbose=1,
                        callbacks=[callbacks])

    score, acc = model.evaluate(X_test, y_test, batch_size=batch_size)
    yhat = model.predict(X_test)

    return history, yhat
```
This led to `IndexError: list index out of range`.
What is the correct way to apply this layer to my model?
As requested, the following code can be used to simulate a dataset:
```python
X_train = np.random.rand(700, 50, 34)
y_train = np.random.choice([0, 1], 700)
X_test = np.random.rand(100, 50, 34)
y_test = np.random.choice([0, 1], 100)
```
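For context on the shape contract involved, here is a minimal NumPy sketch of scaled dot-product self-attention over a `(batch, step, features)` tensor. This is purely illustrative (it is not the `SeqSelfAttention` implementation and has no learned weights); the point is that a self-attention layer both consumes and produces a 3D sequence tensor, which is why it expects per-timestep input rather than a single pooled vector:

```python
import numpy as np

def self_attention(x):
    # x: (batch, step, features); use Q = K = V = x with no learned
    # projections, just to illustrate the shapes involved.
    scores = x @ x.transpose(0, 2, 1) / np.sqrt(x.shape[-1])  # (batch, step, step)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)            # softmax over steps
    return weights @ x                                        # (batch, step, features)

x = np.random.rand(2, 50, 34)
out = self_attention(x)
print(out.shape)  # (2, 50, 34) -- same rank and shape as the input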
https://stackoverflow.com/questions/65402126/the-application-of-self-attention-layer-raised-index-error December 22, 2020 at 09:25AM