Friday, April 9, 2021

How could I add a VGG16 intermediate-layer feature map to the top layer of the same network?

I would like to add features from a shallow layer to the top layer of the network: say 56×56×512 with 14×14×512. I upsampled the 14×14×512 map by a factor of 4 to get 56×56×512, added the two, then downsampled again with a strided convolution and max-pooling, and fed the result to the RPN (after converting it to the required size) for proposal generation. But during training, the RPN layer reports a size mismatch at one position.
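For a 224×224 input, the intended shape flow does work out exactly. As a sanity check, the spatial sizes can be traced with plain integer arithmetic (each MaxPooling2D((2, 2), strides=(2, 2)) halves the size; the layer names refer to the code below):

```python
# Trace only the spatial size of the feature maps for a 224x224 input.
size = 224
size //= 2                    # block1_pool -> 112
size //= 2                    # block2_pool -> 56 (x_3 / x_4 branch taken here)
branch = size
size //= 2                    # block3_pool -> 28
size //= 2                    # block4_pool -> 14 (block5 has no pooling)
upsampled = size * 4          # UpSampling2D(size=(4, 4)) -> 56
assert upsampled == branch    # Add() succeeds: both branches are 56x56
fused = upsampled // 2 // 2   # strided 1x1 conv -> 28, block5_pool -> 14
print(branch, upsampled, fused)  # 56 56 14
```

So with a fixed 224×224 input the Add() and the final 14×14 output are consistent with the printed shapes further down.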

import tensorflow as tf
from tensorflow.keras.layers import Conv2D, MaxPooling2D

# Block 1
x = Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv1')(img_input)
x = Conv2D(64, (3, 3), activation='relu', padding='same', name='block1_conv2')(x)
x = MaxPooling2D((2, 2), strides=(2, 2), name='block1_pool')(x)

# Block 2
x = Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv1')(x)
x = Conv2D(128, (3, 3), activation='relu', padding='same', name='block2_conv2')(x)
x = MaxPooling2D((2, 2), strides=(2, 2), name='block2_pool')(x)

# Block 3
x_3 = x  # branch off the stride-4 feature map (56x56 for a 224x224 input)
x = Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv1')(x)
x = Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv2')(x)
x = Conv2D(256, (3, 3), activation='relu', padding='same', name='block3_conv3')(x)
x = MaxPooling2D((2, 2), strides=(2, 2), name='block3_pool')(x)

# edit (conv4_b): side branch on x_3, no pooling, so it stays at 56x56
x_3 = Conv2D(256, (3, 3), activation='relu', padding='same', name='block_3_conv1')(x_3)
x_3 = Conv2D(256, (3, 3), activation='relu', padding='same', name='block_3_conv2')(x_3)
x_3 = Conv2D(256, (3, 3), activation='relu', padding='same', name='block_3_conv3')(x_3)
x_3 = Conv2D(512, (1, 1), activation='relu', padding='same', name='block_3_conv4')(x_3)

# Block 4
x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv1')(x)
x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv2')(x)
x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block4_conv3')(x)
x = MaxPooling2D((2, 2), strides=(2, 2), name='block4_pool')(x)

# edit (conv5_b): continue the side branch at 56x56x512
x_4 = x_3
x_4 = Conv2D(512, (3, 3), activation='relu', padding='same', name='block_4_conv1')(x_4)
x_4 = Conv2D(512, (3, 3), activation='relu', padding='same', name='block_4_conv2')(x_4)
x_4 = Conv2D(512, (3, 3), activation='relu', padding='same', name='block_4_conv3')(x_4)

# Block 5 (no pooling, so the main branch stays at 14x14)
x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv1')(x)
x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv2')(x)
x = Conv2D(512, (3, 3), activation='relu', padding='same', name='block5_conv3')(x)
# x = MaxPooling2D((2, 2), strides=(2, 2), name='block5_pool')(x)

# edit: upsample 14x14 -> 56x56 and fuse with the x_4 branch
x = tf.keras.layers.UpSampling2D(size=(4, 4), interpolation='nearest')(x)
x = tf.keras.layers.Add()([x_4, x])

print(x_4.shape)
print(x.shape)

# downsample back: strided 1x1 conv (-> 28x28), then max-pooling (-> 14x14)
x = Conv2D(512, (1, 1), activation='relu', padding='same', strides=(2, 2), name='block5_conv4')(x)
x = MaxPooling2D((2, 2), strides=(2, 2), name='block5_pool')(x)

print(x.shape)
return x

x_4.shape: (None, 56, 56, 512)
x.shape: (None, 56, 56, 512)

Returned x.shape: (None, 14, 14, 512)
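The reported mismatch is consistent with variable input sizes during training: MaxPooling2D floors odd sizes, so multiplying the stride-16 size by 4 does not always reproduce the stride-4 size of the x_4 branch. A minimal sketch of the effect in pure shape arithmetic (600 is just an illustrative non-multiple-of-16 dimension):

```python
def branch_sizes(n):
    """Spatial size of the x_4 branch (stride 4) versus the upsampled
    main branch (stride 16, then UpSampling2D x4) for input size n."""
    stride4 = n // 2 // 2            # after block1/block2 pools
    stride16 = stride4 // 2 // 2     # after block3/block4 pools
    return stride4, stride16 * 4     # x_4 size vs upsampled x size

print(branch_sizes(224))  # (56, 56)   -> Add() works
print(branch_sizes(600))  # (150, 148) -> Add() fails: 150 != 148
```

If this is the cause, one workaround is to resize the upsampled tensor to x_4's dynamic spatial size, for example with a Lambda layer wrapping tf.image.resize, or by cropping the larger branch with Cropping2D, rather than relying on a fixed ×4 UpSampling2D.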

Source: https://stackoverflow.com/questions/67029916/how-could-i-add-vgg16-intermediate-layer-feature-with-the-top-layer-of-the-same (April 10, 2021 at 09:06AM)
