This repository has been archived by the owner on Nov 3, 2022. It is now read-only.
Just curious: is it possible to capture inter-dependencies in the hidden states by using a CRF as a middle layer, as in the model summary below, without introducing extra energies and other factors?

Model: "sequential_1"
Layer (type)     Output Shape       Param #
lstm_1 (LSTM)    (None, None, 256)  358400
crf_1 (CRF)      (None, None, 256)  131840
lstm_2 (LSTM)    (None, None, 256)  525312
dense_1 (Dense)  (None, None, 93)   23901
I need to use it as a middle layer so the LSTM does not lose its char-based generative ability; if you use the CRF as the last layer, it easily learns that it only needs to shift the whole sentence one position to the left.
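For what it's worth, the parameter counts in the summary above are consistent with 256 units throughout and a character vocabulary of 93 (matching the final Dense output). A small sanity check, assuming the keras-contrib CRF parameterization (input kernel, chain kernel, bias, and left/right boundary energies) — the vocabulary size is inferred, not stated:

```python
# Sanity-check the parameter counts shown in the model summary.

def lstm_params(units, input_dim):
    # 4 gates, each with an input kernel, a recurrent kernel, and a bias.
    return 4 * (units * input_dim + units * units + units)

def crf_params(units, input_dim):
    # keras-contrib CRF: input kernel, chain (transition) kernel,
    # bias, plus two boundary-energy vectors.
    return units * input_dim + units * units + units + 2 * units

def dense_params(units, input_dim):
    return units * input_dim + units

vocab = 93   # char vocabulary size (assumed from the Dense output shape)
units = 256

print(lstm_params(units, vocab))   # lstm_1:  358400
print(crf_params(units, units))    # crf_1:   131840
print(lstm_params(units, units))   # lstm_2:  525312
print(dense_params(vocab, units))  # dense_1: 23901
```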
With Best Regards, Andrei Buin.