batch_norm_theano.py

alavinezhad

batch_norm_theano.py

Post by alavinezhad »

import theano
import theano.tensor as T

def momentum_updates(cost, params, lr, mu):
    # Symbolic gradients of the cost w.r.t. each parameter
    grads = T.grad(cost, params)
    updates = []

    for p, g in zip(params, grads):
        # Velocity term, initialized to zeros with the same shape/dtype as p
        dp = theano.shared(p.get_value() * 0)
        new_dp = mu*dp - lr*g   # classical momentum: v <- mu*v - lr*g
        new_p = p + new_dp      # take a step along the velocity
        updates.append((dp, new_dp))
        updates.append((p, new_p))
    return updates
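
For reference, here is a minimal, self-contained sketch of how momentum_updates might be wired into a Theano training step. The tiny linear model, the names X, Y, and w, and the hyperparameter values are illustrative assumptions, not the course's actual batch-norm network:

import numpy as np
import theano
import theano.tensor as T

X = T.matrix('X')
Y = T.vector('Y')
w = theano.shared(np.random.randn(2), name='w')   # a toy parameter

cost = T.mean((X.dot(w) - Y)**2)                  # squared error, linear model
updates = momentum_updates(cost, [w], lr=0.01, mu=0.9)

train = theano.function(inputs=[X, Y], outputs=cost, updates=updates)

Xdata = np.random.randn(10, 2)
Ydata = Xdata.dot(np.array([1.0, -2.0]))
for _ in range(100):
    c = train(Xdata, Ydata)   # each call applies one momentum step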

I can't understand this line: dp = theano.shared(p.get_value() * 0). Why is p.get_value() multiplied by 0?
lazyprogrammer

Re: batch_norm_theano.py

Post by lazyprogrammer »

Thanks for your inquiry.

That's the momentum (velocity) term, and it should start at 0. Multiplying p.get_value() by 0 is just a convenient way to create a zero array with the same shape and dtype as the parameter it will update. Check out the momentum section earlier in the course.
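
A quick way to see this for yourself (a minimal sketch using a toy shared variable):

import numpy as np
import theano

# A toy "weight matrix", standing in for one of the network's parameters
w = theano.shared(np.random.randn(3, 2))

# get_value() returns the underlying NumPy array; multiplying by 0
# yields a zero array of the same shape and dtype, so the velocity
# starts at zero but lines up exactly with the parameter it updates.
v0 = w.get_value() * 0
print(v0.shape)   # (3, 2)
print(v0)         # all zeros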