
About mask layer #20440

Open
xiaohua6689 opened this issue Nov 2, 2024 · 3 comments

Comments

@xiaohua6689

Model:

import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Embedding, Concatenate, LSTM, Dense
from tensorflow.keras.models import Model

a = Input(shape=[5])
b = Input(shape=[10])
emb_a = Embedding(8, 5, mask_zero=True)(a)
emb_b = Embedding(20, 5, mask_zero=True)(b)
cat = Concatenate(axis=1)([emb_a, emb_b])  # concatenating two masked sequences
lstm = LSTM(16)(cat)
dense = Dense(1)(lstm)
model = Model(inputs=[a, b], outputs=[dense])

# data
seed_value = 33
np.random.seed(seed_value)
tf.random.set_seed(seed_value)
input_a = np.random.randint(0, 8, size=(1, 5))    # ids in [0, 7]
input_b = np.random.randint(0, 20, size=(1, 10))  # ids in [0, 19]

# calling the model on this data raises the error below
model.predict([input_a, input_b])

Error information

ValueError: Exception encountered when calling BroadcastTo.call().

Attempt to convert a value (None) with an unsupported type (<class 'NoneType'>) to a Tensor.

Arguments received by BroadcastTo.call():
• x=tf.Tensor(shape=(1, 5, 1), dtype=bool)
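For context, with mask_zero=True the Embedding layer attaches a boolean mask that downstream layers must propagate and broadcast; a minimal illustration of that mask (a sketch, assuming tensorflow is installed):

```python
import numpy as np
import tensorflow as tf

# Embedding with mask_zero=True treats input id 0 as padding.
emb = tf.keras.layers.Embedding(8, 5, mask_zero=True)
x = np.array([[3, 0, 1, 0, 2]])

out = emb(x)                 # dense vectors, shape (1, 5, 5)
mask = emb.compute_mask(x)   # boolean mask, True where id != 0
print(out.shape, mask.numpy())
```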

@mehtamansi29
Collaborator

Hi @xiaohua6689 -

Thanks for reporting the issue. You are getting the ValueError: Exception encountered when calling BroadcastTo.call() error because of a mismatch between the model's input layer shapes and the provided data (input_a, input_b).
Simply applying a Reshape layer to the emb_a and emb_b outputs will resolve the error:

emb_a = Reshape((5, 5))(emb_a)
emb_b = Reshape((10, 5))(emb_b)

Gist attached for reference.
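Putting the suggestion together, the workaround applied to the original model might look like this (a sketch; note that Reshape drops the attached mask, which is what avoids the BroadcastTo error, so the LSTM no longer sees the padding mask):

```python
import numpy as np
from tensorflow.keras.layers import (Input, Embedding, Reshape,
                                     Concatenate, LSTM, Dense)
from tensorflow.keras.models import Model

a = Input(shape=[5])
b = Input(shape=[10])
emb_a = Embedding(8, 5, mask_zero=True)(a)
emb_b = Embedding(20, 5, mask_zero=True)(b)

# Reshape does not support masking, so it strips the mask here.
emb_a = Reshape((5, 5))(emb_a)
emb_b = Reshape((10, 5))(emb_b)

cat = Concatenate(axis=1)([emb_a, emb_b])
lstm = LSTM(16)(cat)
dense = Dense(1)(lstm)
model = Model(inputs=[a, b], outputs=[dense])

input_a = np.random.randint(0, 8, size=(1, 5))
input_b = np.random.randint(0, 20, size=(1, 10))
print(model.predict([input_a, input_b]).shape)  # (1, 1)
```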

@xiaohua6689
Author

Using the Reshape layer produced a new warning: "UserWarning: Layer 'reshape' (of type Reshape) was passed an input with a mask attached to it. However, this layer does not support masking and will therefore destroy the mask information. Downstream layers will not see the mask."
One more piece of information: the version of TensorFlow installed on my PC is 2.18.0; when I roll back to 2.10.0, the code runs fine.


@mehtamansi29
Collaborator

Hi @xiaohua6689 -

Here you can use a separate keras.layers.Masking layer along with the Embedding layer. The Embedding layer maps the integer ids to dense vectors, and Masking is used to handle variable-length sequences.

import tensorflow as tf
from tensorflow.keras.layers import (Input, Embedding, Masking, Reshape,
                                     Concatenate, LSTM, Dense)
from tensorflow.keras.models import Model

a = Input(shape=[5], dtype=tf.int32)
b = Input(shape=[10], dtype=tf.int32)
emb_a = Embedding(8, 5)(a)
emb_b = Embedding(20, 5)(b)

# Masking flags a timestep as padding when all of its features equal mask_value
mask_a = Masking(mask_value=0)(emb_a)
mask_b = Masking(mask_value=0)(emb_b)

mask_a = Reshape((5, 5))(mask_a)
mask_b = Reshape((10, 5))(mask_b)

# feed the masked tensors into the rest of the model
cat = Concatenate(axis=1)([mask_a, mask_b])
lstm = LSTM(16)(cat)
dense = Dense(1)(lstm)
model = Model(inputs=[a, b], outputs=[dense])
model.compile(optimizer='adam', loss='mse')
model.summary()

Gist attached for reference.
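The Masking layer's behavior can also be seen on its own: it marks a timestep as padded only when every feature at that step equals mask_value (a small sketch, assuming tensorflow is installed):

```python
import numpy as np
import tensorflow as tf

# Masking masks a timestep when all of its features equal mask_value.
masking = tf.keras.layers.Masking(mask_value=0.0)
x = np.array([[[1.0, 2.0],
               [0.0, 0.0],   # all-zero timestep -> masked
               [3.0, 0.0]]]) # partially zero -> kept
print(masking.compute_mask(x).numpy())  # [[ True False  True]]
```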
