deep learning - Modifying the perform function in Theano.tensor.nnet.softmax


I have begun using Lasagne and Theano for machine learning in Python.

I am trying to modify the softmax class in Theano. I want to change how the activation function (softmax) is calculated. Instead of dividing e_x by e_x.sum(axis=1), I want to divide e_x by the sum of 3 consecutive numbers.

For instance, the result would be as follows:

sm[0] = e_x[0]/(e_x[0]+e_x[1]+e_x[2])
sm[1] = e_x[1]/(e_x[0]+e_x[1]+e_x[2])
sm[2] = e_x[2]/(e_x[0]+e_x[1]+e_x[2])
sm[3] = e_x[3]/(e_x[3]+e_x[4]+e_x[5])
sm[4] = e_x[4]/(e_x[3]+e_x[4]+e_x[5])
sm[5] = e_x[5]/(e_x[3]+e_x[4]+e_x[5])

and so on...
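In plain NumPy, the result described above would look roughly like the sketch below (a minimal sketch, assuming the grouping runs over consecutive columns of a 2-D input whose column count is a multiple of 3; grouped_softmax is just an illustrative helper name, not an existing function):

import numpy as np

def grouped_softmax(x, group_size=3):
    # Normalise within consecutive groups of `group_size` columns;
    # assumes x.shape[1] is a multiple of group_size.
    e_x = np.exp(x - x.max(axis=1, keepdims=True))   # row max cancels within each group
    e_g = e_x.reshape(x.shape[0], -1, group_size)    # (rows, groups, group_size)
    return (e_g / e_g.sum(axis=2, keepdims=True)).reshape(x.shape)

print(grouped_softmax(np.array([[.1, .2, .3, .4, .5, .6]])))
# each block of 3 entries sums to 1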

The problem is that I cannot quite grasp how Theano carries out the computation.

Here is my main question: is it sufficient to change the perform() function in the softmax class?

Here is the original perform() function:

def perform(self, node, input_storage, output_storage):
    x, = input_storage
    e_x = numpy.exp(x - x.max(axis=1)[:, None])
    sm = e_x / e_x.sum(axis=1)[:, None]
    output_storage[0][0] = sm

Here is my modified perform():

def myperform(self, node, input_storage, output_storage):
    x, = input_storage
    e_x = numpy.exp(x - x.max(axis=1)[:, None])
    sm = numpy.zeros_like(e_x)
    for i in range(0, symbolcount):
        total = e_x[3*i] + e_x[3*i+1] + e_x[3*i+2]
        sm[3*i] = e_x[3*i]/total
        sm[3*i+1] = e_x[3*i+1]/total
        sm[3*i+2] = e_x[3*i+2]/total
    output_storage[0][0] = sm

With the current code, I am getting an 'unorderable types: int() > str()' error when I use the predict method in Lasagne.

You're better off constructing your custom softmax via symbolic expressions rather than creating (or modifying) an operation.

Your custom softmax can be defined in terms of symbolic expressions. Doing it this way will give you gradients (and other Theano operation bits and pieces) "for free", but it might run slightly slower than a custom operation could.

Here's an example:

import numpy
import theano
import theano.tensor as tt

x = tt.matrix()

# Use the built-in softmax operation
y1 = tt.nnet.softmax(x)

# A regular softmax operation defined via ordinary Theano symbolic expressions
y2 = tt.exp(x)
y2 = y2 / y2.sum(axis=1)[:, None]

# The custom softmax operation
def custom_softmax(a):
    b = tt.exp(a)
    b1 = b[:, :3] / b[:, :3].sum(axis=1)[:, None]
    b2 = b[:, 3:] / b[:, 3:].sum(axis=1)[:, None]
    return tt.concatenate([b1, b2], axis=1)

y3 = custom_softmax(x)

f = theano.function([x], outputs=[y1, y2, y3])

x_value = [[.1, .2, .3, .4, .5, .6], [.1, .3, .5, .2, .4, .6]]
y1_value, y2_value, y3_value = f(x_value)
assert numpy.allclose(y1_value, y2_value)
assert y3_value.shape == y1_value.shape

a = numpy.exp(.1) + numpy.exp(.2) + numpy.exp(.3)
b = numpy.exp(.4) + numpy.exp(.5) + numpy.exp(.6)
c = numpy.exp(.1) + numpy.exp(.3) + numpy.exp(.5)
d = numpy.exp(.2) + numpy.exp(.4) + numpy.exp(.6)
assert numpy.allclose(y3_value, [
    [numpy.exp(.1) / a, numpy.exp(.2) / a, numpy.exp(.3) / a, numpy.exp(.4) / b, numpy.exp(.5) / b, numpy.exp(.6) / b],
    [numpy.exp(.1) / c, numpy.exp(.3) / c, numpy.exp(.5) / c, numpy.exp(.2) / d, numpy.exp(.4) / d, numpy.exp(.6) / d]
]), y3_value
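As a further, hedged sketch (not part of the answer above): if the number of three-column groups is not fixed at two, the same idea can be written with a symbolic reshape, and theano.grad then flows straight through it, which is the "gradients for free" point. The name grouped_softmax is illustrative, not an existing Theano or Lasagne function:

import theano
import theano.tensor as tt

x = tt.matrix()

def grouped_softmax(a, group_size=3):
    # Same computation as custom_softmax above, but written with a symbolic
    # reshape so it handles any column count that is a multiple of group_size.
    e = tt.exp(a)
    e = e.reshape((a.shape[0], a.shape[1] // group_size, group_size))
    e = e / e.sum(axis=2, keepdims=True)      # normalise within each group
    return e.reshape((a.shape[0], a.shape[1]))

y4 = grouped_softmax(x)
g = theano.grad(y4.sum(), x)                  # gradients come "for free"
f2 = theano.function([x], [y4, g])
y4_value, g_value = f2([[.1, .2, .3, .4, .5, .6], [.1, .3, .5, .2, .4, .6]])
# For these 6-column inputs, y4_value should match y3_value from the example above.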
