I'm running into an error with TensorFlow 2. How can I fix it?
Here is my code (assume all the relevant Keras modules/objects have already been imported):
```python
dense1 = 2**7
dense2 = 2**8
dense3 = 2**9
dropout = 0.8
price_loss = 1
cut_loss = 1
activation = LeakyReLU()

#====================================================================
# Inputs
#====================================================================
carat = Input(shape=(1,), batch_size=batch_size, name='carat')
color = Input(shape=(1,), batch_size=batch_size, name='color')
clarity = Input(shape=(1,), batch_size=batch_size, name='clarity')
depth = Input(shape=(1,), batch_size=batch_size, name='depth')
table = Input(shape=(1,), batch_size=batch_size, name='table')
x = Input(shape=(1,), batch_size=batch_size, name='x')
y = Input(shape=(1,), batch_size=batch_size, name='y')
z = Input(shape=(1,), batch_size=batch_size, name='z')

#====================================================================
# Embeddings for the categorical features "color" and "clarity"
#====================================================================
color = Embedding(input_dim=7, output_dim=1, name='color_emb')(color)
clarity = Embedding(input_dim=8, output_dim=1, name='clarity_emb')(clarity)
color = Flatten()(color)
clarity = Flatten()(clarity)

#====================================================================
# Concatenate the features
#====================================================================
x = Concatenate()([color, clarity, carat, depth, table, x, y, z])

#====================================================================
# Dense network
#====================================================================
x = Dense(dense1, activation=activation)(x)
x = BatchNormalization()(x)
x = Dense(dense2, activation=activation)(x)
x = BatchNormalization()(x)
x = Dense(dense3, activation=activation)(x)
x = BatchNormalization()(x)
x = Dropout(dropout)(x)

#====================================================================
# Predictions
#====================================================================
cut = Dense(1, activation='sigmoid')(x)
price = Dense(1)(x)

#====================================================================
# Define the model
#====================================================================
model = Model(inputs=[carat, color, clarity, depth, table, x, y, z], outputs=[cut, price])

#====================================================================
# Compile the model
#====================================================================
model.compile(
    optimizer='Adam',
    loss={
        "price": "huber_loss",
        "cut": "binary_crossentropy",
    },
    loss_weights=[price_loss, cut_loss],
    metrics={
        "price": ["mean_absolute_percentage_error"],
        "cut": [tf.keras.metrics.AUC(), tf.keras.metrics.Precision(thresholds=thresholds)],
    },
)
```
Stack trace:
```
WARNING:tensorflow:Functional inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_1" was not an Input tensor, it was generated by layer flatten_8.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: flatten_8/Reshape:0
WARNING:tensorflow:Functional inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_1" was not an Input tensor, it was generated by layer flatten_9.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: flatten_9/Reshape:0
WARNING:tensorflow:Functional inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "functional_1" was not an Input tensor, it was generated by layer dropout_2.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: dropout_2/cond/Identity:0
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-64-132a2d8458b9> in <module>
    135 # ====================================================================
    136
--> 137 model = Model(inputs = [carat, color, clarity, depth, table, x, y, z] , outputs = [cut , price])
    138
    139 #====================================================================

~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\training.py in __new__(cls, *args, **kwargs)
    240       # Functional model
    241       from tensorflow.python.keras.engine import functional  # pylint: disable=g-import-not-at-top
--> 242       return functional.Functional(*args, **kwargs)
    243     else:
    244       return super(Model, cls).__new__(cls, *args, **kwargs)

~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\training\tracking\base.py in _method_wrapper(self, *args, **kwargs)
    455     self._self_setattr_tracking = False  # pylint: disable=protected-access
    456     try:
--> 457       result = method(self, *args, **kwargs)
    458     finally:
    459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\functional.py in __init__(self, inputs, outputs, name, trainable)
    113     #     'arguments during initialization. Got an unexpected argument:')
    114     super(Functional, self).__init__(name=name, trainable=trainable)
--> 115     self._init_graph_network(inputs, outputs)
    116
    117   @trackable.no_automatic_dependency_tracking

~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\training\tracking\base.py in _method_wrapper(self, *args, **kwargs)
    455     self._self_setattr_tracking = False  # pylint: disable=protected-access
    456     try:
--> 457       result = method(self, *args, **kwargs)
    458     finally:
    459       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\functional.py in _init_graph_network(self, inputs, outputs)
    189     # Keep track of the network's nodes and layers.
    190     nodes, nodes_by_depth, layers, _ = _map_graph_network(
--> 191         self.inputs, self.outputs)
    192     self._network_nodes = nodes
    193     self._nodes_by_depth = nodes_by_depth

~\AppData\Roaming\Python\Python37\site-packages\tensorflow\python\keras\engine\functional.py in _map_graph_network(inputs, outputs)
    929                 'The following previous layers '
    930                 'were accessed without issue: ' +
--> 931                 str(layers_with_complete_input))
    932             for x in nest.flatten(node.outputs):
    933               computable_tensors.add(id(x))

ValueError: Graph disconnected: cannot obtain value for tensor Tensor("clarity_8:0", shape=(20, 1), dtype=float32) at layer "clarity_emb". The following previous layers were accessed without issue: []
```
Answer:
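The "Graph disconnected" error comes from variable shadowing, not from the network itself. You overwrite the Python variables that hold your `Input` tensors: `color` and `clarity` are reassigned to `Embedding`/`Flatten` outputs, and `x` is reassigned to the `Concatenate`/`Dense`/`Dropout` outputs. By the time you call `Model(inputs=[carat, color, clarity, depth, table, x, y, z], ...)`, three entries in the `inputs` list are intermediate layer outputs rather than `tf.keras.Input` tensors, which is exactly what the warnings about `flatten_8`, `flatten_9`, and `dropout_2` are reporting. The fix is to keep a dedicated variable for every `Input` tensor. Below is a minimal corrected sketch; the `_in` suffixes are my naming choice, `batch_size = 20` is assumed from the `shape=(20, 1)` in the trace, the output layers are explicitly named `cut` and `price` so the dict keys in `compile` can resolve to them, and `Precision()` is left at its default thresholds since `thresholds` was not defined in your snippet:

```python
import tensorflow as tf
from tensorflow.keras.layers import (Input, Embedding, Flatten, Concatenate,
                                     Dense, BatchNormalization, Dropout, LeakyReLU)
from tensorflow.keras.models import Model

batch_size = 20  # assumed from the trace: Tensor("clarity_8:0", shape=(20, 1), ...)

# One dedicated, never-reassigned variable per Input tensor.
carat_in = Input(shape=(1,), batch_size=batch_size, name='carat')
color_in = Input(shape=(1,), batch_size=batch_size, name='color')
clarity_in = Input(shape=(1,), batch_size=batch_size, name='clarity')
depth_in = Input(shape=(1,), batch_size=batch_size, name='depth')
table_in = Input(shape=(1,), batch_size=batch_size, name='table')
x_in = Input(shape=(1,), batch_size=batch_size, name='x')
y_in = Input(shape=(1,), batch_size=batch_size, name='y')
z_in = Input(shape=(1,), batch_size=batch_size, name='z')

# Embeddings for the categorical features, stored under new names.
color_emb = Flatten()(Embedding(input_dim=7, output_dim=1, name='color_emb')(color_in))
clarity_emb = Flatten()(Embedding(input_dim=8, output_dim=1, name='clarity_emb')(clarity_in))

# The hidden activations go into `h`, so `x_in` stays an Input tensor.
h = Concatenate()([color_emb, clarity_emb, carat_in, depth_in, table_in, x_in, y_in, z_in])
h = Dense(2**7, activation=LeakyReLU())(h)
h = BatchNormalization()(h)
h = Dense(2**8, activation=LeakyReLU())(h)
h = BatchNormalization()(h)
h = Dense(2**9, activation=LeakyReLU())(h)
h = BatchNormalization()(h)
h = Dropout(0.8)(h)

# Naming the output layers makes the dict keys in compile() valid.
cut = Dense(1, activation='sigmoid', name='cut')(h)
price = Dense(1, name='price')(h)

model = Model(inputs=[carat_in, color_in, clarity_in, depth_in, table_in, x_in, y_in, z_in],
              outputs=[cut, price])
model.compile(
    optimizer='Adam',
    loss={"price": "huber_loss", "cut": "binary_crossentropy"},
    loss_weights={"price": 1.0, "cut": 1.0},
    metrics={
        "price": ["mean_absolute_percentage_error"],
        "cut": [tf.keras.metrics.AUC(), tf.keras.metrics.Precision()],
    },
)
```

With the `Input` tensors preserved, Keras can trace a path from every model input to both outputs and the graph is no longer disconnected. The same rule explains the `dropout_2` warning: `x` in your `inputs` list was the `Dropout` output, not the `Input` named `'x'`.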