[Question] Error when running an analysis with mxnet

OP: savin (小马克)   2019-05-31 00:29:15
The data look like this:
No.     x1     x2     x3     x4      y1      y2
 1   -17.8    1.2   17.5    4.2    24.7   -56.4
 2   -11.2   -3.7    0     14.4   -53.8    34.4
 3     3.9   -5.7   20.9    7.8    18.3    22
 4     4.2   -5.9  -10     -6.8   -47.5    43.3
 5    -1.8  -13.8   -4.6  -10.4  -185.1   164.8
 6     0     -3.4   10.6    6.1     8.2     8.5
... 200 records in total have y1 and y2; another 80 records have y1 and y2 unknown.
I followed some tutorials online and tried mxnet, but so far I can't get it to run =.=
I haven't been able to find a fix myself and have already searched past posts; hoping someone can help.
Error in symbol$infer.shape(list(...)) :
Error in operator dnn: Shape inconsistent, Provided=[200], inferred
shape=[200,10]
library(neuralnet) # for neuralnet(), nn model
library(nnet) # for class.ind()
library(xlsx)
library(mxnet)
#Load data
Data_Path<-"D:/2019/W21/R/DM_Assignment_4_NN.xlsx"
Data<-read.xlsx(file=Data_Path,sheetIndex=1 , header=T)
Data<-na.omit(Data) # drop the rows without y1, y2
str(Data)
head(Data)
Data<-as.matrix(Data)
# Input layer
data <- mx.symbol.Variable("data")
# First hidden layer: 500 nodes, fully connected
fc1 <- mx.symbol.FullyConnected(data, name="1-fc", num_hidden=500)
# Activation function of the first hidden layer: ReLU
act1 <- mx.symbol.Activation(fc1, name="relu1", act_type="relu")
# Apply dropout here
drop1 <- mx.symbol.Dropout(data=act1, p=0.5)
# Second hidden layer: 500 nodes, fully connected
fc2 <- mx.symbol.FullyConnected(drop1, name="2-fc", num_hidden=500)
# Activation function of the second hidden layer: ReLU
act2 <- mx.symbol.Activation(fc2, name="relu2", act_type="relu")
# Apply dropout here
drop2 <- mx.symbol.Dropout(data=act2, p=0.5)
# Output layer: 10 nodes, because the prediction targets are the ten digits 0-9
output <- mx.symbol.FullyConnected(drop2, name="output", num_hidden=10)
# Output loss: linear regression
dnn <- mx.symbol.LinearRegressionOutput(output, name="dnn")
# Split the data: columns 2:5 are x1-x4, column 6 is y1, column 7 is y2
train.ind <- c(1:200)
train.x = data.matrix(Data[train.ind, 2:5])
train.y1 = c(Data[train.ind, 6])
train.y2 = data.matrix(Data[train.ind, 7])
test.x = data.matrix(Data[-train.ind, 2:5])
test.y1 = data.matrix(Data[-train.ind, 6])
test.y2 = data.matrix(Data[-train.ind, 7])
mx.set.seed(0)
# Train the model defined above
dnn.model <- mx.model.FeedForward.create(
dnn, # the DNN symbol defined above
X=train.x, # train.x
y=train.y1, # train.y
ctx=mx.cpu(), # run on CPU or GPU
num.round=4, # iteration round
array.batch.size=200, # batch size
learning.rate=0.05, # learn rate
momentum=0.9, # momentum
array.layout = "rowmajor",
eval.metric=mx.metric.accuracy, # metric used to evaluate predictions
initializer=mx.init.uniform(0.07), # parameter initializer
epoch.end.callback=mx.callback.log.train.metric(100)
)
Reply: ching0629 (Syameroke)   2019-06-01 14:41:00
Change num_hidden of the output layer from 10 to 1. If you want to learn this properly, I recommend: https://linchin.ndmctsgh.edu.tw/course.html
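
The shape error in the original post comes from the label check in LinearRegressionOutput: with num_hidden=10 on the output layer, the label shape is inferred as [200,10], but train.y1 is a length-200 vector, hence "Provided=[200], inferred shape=[200,10]". Below is a minimal sketch with the reply's fix applied, assuming train.x and train.y1 from the original script are already in the workspace; swapping mx.metric.accuracy for mx.metric.rmse is an extra assumption on my part, since accuracy is a classification metric and this target is continuous.

library(mxnet)

data  <- mx.symbol.Variable("data")
fc1   <- mx.symbol.FullyConnected(data, name="1-fc", num_hidden=500)
act1  <- mx.symbol.Activation(fc1, name="relu1", act_type="relu")
drop1 <- mx.symbol.Dropout(data=act1, p=0.5)
fc2   <- mx.symbol.FullyConnected(drop1, name="2-fc", num_hidden=500)
act2  <- mx.symbol.Activation(fc2, name="relu2", act_type="relu")
drop2 <- mx.symbol.Dropout(data=act2, p=0.5)
# One output node: y1 is a single continuous value, so the regression loss
# expects a label of shape [batch], which matches train.y1
output <- mx.symbol.FullyConnected(drop2, name="output", num_hidden=1)
dnn    <- mx.symbol.LinearRegressionOutput(output, name="dnn")

mx.set.seed(0)
dnn.model <- mx.model.FeedForward.create(
  dnn,
  X=train.x, y=train.y1,
  ctx=mx.cpu(),
  num.round=4,
  array.batch.size=200,
  learning.rate=0.05,
  momentum=0.9,
  array.layout="rowmajor",
  eval.metric=mx.metric.rmse,          # regression metric (assumption; accuracy is for classification)
  initializer=mx.init.uniform(0.07),
  epoch.end.callback=mx.callback.log.train.metric(100)
)

Predictions for the 80 unlabeled rows should then come from predict(dnn.model, test.x, array.layout="rowmajor"), and the same structure can be retrained with train.y2 to cover the second target.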
