"Mxnet"의 두 판 사이의 차이
Why do so many of these frameworks keep getting made? Not that I'm fond of tf either.
etc
- batch normalization example
import mxnet as mx

def ConvFactory(data, num_filter, kernel, stride=(1, 1), pad=(0, 0), name=None, suffix=''):
    # Convolution -> BatchNorm -> ReLU: the usual "conv block" pattern
    conv = mx.sym.Convolution(data=data, num_filter=num_filter, kernel=kernel,
                              stride=stride, pad=pad, name='conv_%s%s' % (name, suffix))
    # batch normalization is applied to the convolution output, before the activation
    bn = mx.sym.BatchNorm(data=conv, name='bn_%s%s' % (name, suffix))
    act = mx.sym.Activation(data=bn, act_type='relu', name='relu_%s%s' % (name, suffix))
    return act
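
A minimal usage sketch of the factory above, using the standard MXNet symbolic API; the input variable name 'data', the filter count, and the kernel size here are illustrative choices, not taken from the original.

# usage sketch (illustrative values): build a small symbolic graph with one block
data = mx.sym.Variable('data')
net = ConvFactory(data, num_filter=64, kernel=(3, 3), pad=(1, 1), name='stage1')
# the conv/bn weights appear under the generated names, e.g. conv_stage1, bn_stage1
print(net.list_arguments())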