I have been looking into CNTK and decided to build a model that implements the XOR function to make sure I understood the basics. I created the file below, but the model performs very poorly, so I suspect I am missing something fundamental.
command = Train:Output:DumpNodeInfo

modelPath = "Models\xor.dnn"
deviceId = -1
makeMode = false

featureDimension = 2
labelDimension = 1

Train = [
    action = "train"

    BrainScriptNetworkBuilder = {
        FDim = $featureDimension$
        LDim = $labelDimension$

        features = Input {FDim}
        labels   = Input {LDim}

        W0 = ParameterTensor {(FDim:FDim)} ; b0 = ParameterTensor {FDim}
        W1 = ParameterTensor {(LDim:FDim)} ; b1 = ParameterTensor {LDim}

        o1 = W0*features + b0
        z  = Sigmoid (W1*o1 + b1)

        ce   = SquareError (labels, z)
        errs = ClassificationError (labels, z)

        # root nodes
        featureNodes    = (features)
        labelNodes      = (labels)
        criterionNodes  = (ce)
        evaluationNodes = (errs)
        outputNodes     = (z)
    }

    SGD = [
        epochSize = 0
        minibatchSize = 1
        learningRatesPerSample = 0.4
        maxEpochs = 50
    ]

    reader = [
        readerType = "CNTKTextFormatReader"
        file = "Train_xor.txt"
        input = [
            features = [
                dim = $featureDimension$
                alias = X
                format = "dense"
            ]
            labels = [
                dim = $labelDimension$
                alias = y
                format = "dense"
            ]
        ]
    ]
]

Output = [
    action = "write"
    reader = [
        readerType = "CNTKTextFormatReader"
        file = "Train_xor.txt"
        input = [
            features = [
                dim = $featureDimension$
                alias = X
                format = "dense"
            ]
            labels = [
                dim = $labelDimension$
                alias = y
                format = "dense"
            ]
        ]
    ]
    outputNodeNames = z
    outputPath = "Output\xor.txt"
]

DumpNodeInfo = [
    action = "dumpNode"
    printValues = true
]
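For reference, a config like this is run through the CNTK command-line tool; assuming the file above is saved as xor.cntk (the file name is my assumption, not stated in the question), the invocation would look like:

    cntk configFile=xor.cntk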
The input file looks like this:
|y 0 |X 0 0
|y 1 |X 1 0
|y 1 |X 0 1
|y 0 |X 1 1
The output I get is:
0.490156
0.490092
0.489984
0.489920
In case it helps, the node dump looks like this:
b0=LearnableParameter [2,1]   learningRateMultiplier=1.000000  NeedsGradient=true
 -0.00745151564
 0.0358283482
 ####################################################################
b1=LearnableParameter [1,1]   learningRateMultiplier=1.000000  NeedsGradient=true
 -0.0403601788
 ####################################################################
ce=SquareError ( labels , z )
errs=ClassificationError ( labels , z )
features=InputValue [ 2 ]
labels=InputValue [ 1 ]
o1=Plus ( o1.PlusArgs[0] , b0 )
o1.PlusArgs[0]=Times ( W0 , features )
W0=LearnableParameter [2,2]   learningRateMultiplier=1.000000  NeedsGradient=true
 -0.0214280766 0.0442263819
 -0.0401388146 0.0261882655
 ####################################################################
W1=LearnableParameter [1,2]   learningRateMultiplier=1.000000  NeedsGradient=true
 -0.0281925034 0.0214234442
 ####################################################################
z=Sigmoid ( z._ )
z._=Plus ( z._.PlusArgs[0] , b1 )
z._.PlusArgs[0]=Times ( W1 , o1 )
Answer:
You definitely need a nonlinearity on your hidden units, e.g. o1 = Tanh (W0*features + b0). Without it, the two layers collapse into a single linear map followed by a sigmoid, which cannot represent XOR. In general, learning XOR with SGD and only two hidden units is tricky: many random initializations will diverge. Learning becomes much easier with three or more hidden units.
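A minimal sketch of how the network section might look with those two changes applied; the hidden width HDim = 3 and the variable name HDim are my assumptions, not taken from the original config:

    BrainScriptNetworkBuilder = {
        FDim = $featureDimension$
        LDim = $labelDimension$
        HDim = 3    # assumed hidden width; three or more units makes XOR easier to learn

        features = Input {FDim}
        labels   = Input {LDim}

        # hidden layer is now HDim wide instead of FDim
        W0 = ParameterTensor {(HDim:FDim)} ; b0 = ParameterTensor {HDim}
        W1 = ParameterTensor {(LDim:HDim)} ; b1 = ParameterTensor {LDim}

        o1 = Tanh (W0*features + b0)       # nonlinearity on the hidden layer
        z  = Sigmoid (W1*o1 + b1)

        ce   = SquareError (labels, z)
        errs = ClassificationError (labels, z)

        # root nodes
        featureNodes    = (features)
        labelNodes      = (labels)
        criterionNodes  = (ce)
        evaluationNodes = (errs)
        outputNodes     = (z)
    }

The rest of the config (SGD and reader sections) can stay as it is; only the network definition needs to change.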