Wednesday, May 5, 2021

ValueError: logits and labels must have the same shape ((None, 1) vs (None, 2))

I am trying to build a neural network with multiple outputs. I want to recognize gender and age from a face image, and I will add further outputs once this issue is resolved.

Input type = Image (originally 200x200, resized to 64x64)
Output type = Array (len = 2)

    (Pdb) x_train.shape
    (18965, 64, 64, 1)
    (Pdb) y_train.shape
    (18965, 2)
    (Pdb) x_test.shape
    (4742, 64, 64, 1)
    (Pdb) y_test.shape
    (4742, 2)
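Since y_train packs both targets into a single (None, 2) array, here is a minimal sketch of splitting it into one vector per output head of the network shown below (column 0 = gender, column 1 = age is only an assumption here; swap the indices if the order is the other way around):

    # Hypothetical split of the combined label array into per-head targets.
    # Assumption: column 0 is the binary gender label, column 1 is the age.
    y_train_gen = y_train[:, 0].astype("float32")   # shape (18965,)
    y_train_age = y_train[:, 1].astype("float32")   # shape (18965,)
    y_test_gen  = y_test[:, 0].astype("float32")    # shape (4742,)
    y_test_age  = y_test[:, 1].astype("float32")    # shape (4742,)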

Neural Network:-

    import tensorflow as tf
    from tensorflow.keras.layers import (Input, Conv2D, BatchNormalization,
                                         MaxPooling2D, Dropout, Dense, Flatten)
    from tensorflow.keras.models import Model
    from tensorflow.keras.optimizers import Adam

    # Shared convolutional trunk.
    input_layer = Input((x_train[0].shape))
    conv1 = Conv2D(64, (4, 4), activation="relu", strides=(2, 2))(input_layer)
    batchnorm1 = BatchNormalization()(conv1)
    maxpool1 = MaxPooling2D((2, 2))(batchnorm1)
    drop1 = Dropout(0.3)(maxpool1)
    conv2 = Conv2D(64, (4, 4), activation="relu", strides=(2, 2))(drop1)
    batchnorm2 = BatchNormalization()(conv2)
    maxpool2 = MaxPooling2D((2, 2))(batchnorm2)
    drop2 = Dropout(0.3)(maxpool2)
    conv3 = Conv2D(64, (2, 2), activation='relu', strides=(1, 1))(drop2)
    dense = Dense(32, activation='relu')(conv3)
    flat = Flatten()(dense)

    # Two heads: age regression and gender classification.
    dense1 = Dense(32, activation='relu')(flat)
    dense2 = Dense(32, activation='relu')(flat)
    age_out = Dense(1, activation='relu', name='age')(dense1)
    gen_out = Dense(1, activation='sigmoid', name='gen')(dense2)

    # Halve the learning rate every 25 epochs after a 25-epoch warm-up.
    # NOTE: as written this returns None (and will fail) for any epoch >= 25
    # that is not a multiple of 25.
    def scheduler(epoch, lr):
        if epoch < 25:
            return lr
        elif epoch % 25 == 0:
            return lr * 0.5

    learning_rate = tf.keras.callbacks.LearningRateScheduler(scheduler)
    adam = Adam(learning_rate=5e-5)

    model = Model(inputs=input_layer, outputs=[gen_out, age_out])
    # model._set_output_names
    model.compile(loss={'gen': "binary_crossentropy", 'age': "mae"},
                  optimizer=adam, metrics=['accuracy'])
    model.fit(x_train, y_train, batch_size=50,
              validation_data=(x_test, y_test), epochs=100,
              callbacks=[learning_rate])

Error:-

Traceback (most recent call last):
  File "F:\projects\Ultimate Project\age.py", line 86, in <module>
    neural_network()
  File "F:\projects\Ultimate Project\age.py", line 78, in neural_network
    model.fit(x_train,y_train,batch_size=50,validation_data = (x_test,y_test),epochs=100, callbacks=[learning_rate])
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\engine\training.py", line 1100, in fit
    tmp_logs = self.train_function(iterator)
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\eager\def_function.py", line 828, in __call__
    result = self._call(*args, **kwds)
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\eager\def_function.py", line 871, in _call
    self._initialize(args, kwds, add_initializers_to=initializers)
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\eager\def_function.py", line 726, in _initialize
    *args, **kwds))
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\eager\function.py", line 2969, in _get_concrete_function_internal_garbage_collected
    graph_function, _ = self._maybe_define_function(args, kwargs)
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\eager\function.py", line 3361, in _maybe_define_function
    graph_function = self._create_graph_function(args, kwargs)
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\eager\function.py", line 3206, in _create_graph_function
    capture_by_value=self._capture_by_value),
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\framework\func_graph.py", line 990, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\eager\def_function.py", line 634, in wrapped_fn
    out = weak_wrapped_fn().__wrapped__(*args, **kwds)
  File "F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\framework\func_graph.py", line 977, in wrapper
    raise e.ag_error_metadata.to_exception(e)
ValueError: in user code:

    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\engine\training.py:805 train_function  *
        return step_function(self, iterator)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\engine\training.py:795 step_function  **
        outputs = model.distribute_strategy.run(run_step, args=(data,))
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:1259 run
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2730 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:3417 _call_for_each_replica
        return fn(*args, **kwargs)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\engine\training.py:788 run_step  **
        outputs = model.train_step(data)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\engine\training.py:756 train_step
        y, y_pred, sample_weight, regularization_losses=self.losses)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\engine\compile_utils.py:203 __call__
        loss_value = loss_obj(y_t, y_p, sample_weight=sw)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\losses.py:152 __call__
        losses = call_fn(y_true, y_pred)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\losses.py:256 call  **
        return ag_fn(y_true, y_pred, **self._fn_kwargs)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\util\dispatch.py:201 wrapper
        return target(*args, **kwargs)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\losses.py:1608 binary_crossentropy
        K.binary_crossentropy(y_true, y_pred, from_logits=from_logits), axis=-1)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\util\dispatch.py:201 wrapper
        return target(*args, **kwargs)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\keras\backend.py:4979 binary_crossentropy
        return nn.sigmoid_cross_entropy_with_logits(labels=target, logits=output)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\util\dispatch.py:201 wrapper
        return target(*args, **kwargs)
    F:\projects\Ultimate Project\env\lib\site-packages\tensorflow\python\ops\nn_impl.py:174 sigmoid_cross_entropy_with_logits
        (logits.get_shape(), labels.get_shape()))

    ValueError: logits and labels must have the same shape ((None, 1) vs (None, 2))

As far as I can understand, this error means that the 'gen' output of the network has shape (None, 1), but the label it is being compared against is the whole (None, 2) y_train array. If I instead compile with model.compile(loss=["binary_crossentropy","mae"], optimizer=adam, metrics=['accuracy']) and build the model as model = Model(inputs=input_layer, outputs=tf.keras.layers.concatenate([gen_out, age_out], axis=-1)), then it runs, but it gives me only a single combined loss instead of a separate loss for each output layer, and that loss goes negative after some epochs.
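For clarity, this is the per-output target layout I am aiming for: one loss and one label array per named output, matched by the layer names 'gen' and 'age'. A minimal sketch (untested, reusing the y_train_gen / y_train_age split shown earlier and otherwise keeping the model as posted):

    model = Model(inputs=input_layer, outputs=[gen_out, age_out])

    # One loss per named output; Keras matches them to the layers named 'gen' and 'age'.
    model.compile(loss={'gen': 'binary_crossentropy', 'age': 'mae'},
                  optimizer=adam, metrics=['accuracy'])

    # Targets passed as a dict keyed by output name, so each head is compared
    # against a (None, 1)-compatible label instead of the combined (None, 2) array.
    model.fit(x_train,
              {'gen': y_train_gen, 'age': y_train_age},
              batch_size=50,
              validation_data=(x_test, {'gen': y_test_gen, 'age': y_test_age}),
              epochs=100,
              callbacks=[learning_rate])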

I tried my best to explain my problem; if there is any confusion, my apologies. Let me know and I will add more details.

https://stackoverflow.com/questions/67406909/valueerror-logits-and-labels-must-have-the-same-shape-none-1-vs-none-2 May 06, 2021 at 02:31AM
