
Question

The update of the weight vector in basic competitive learning can be represented by?

a.

w(t + 1) = w(t) + Δw(t)

b.

w(t + 1) = w(t)

c.

w(t + 1) = w(t) – Δw(t)

d.

none of the mentioned


Answer: (a) w(t + 1) = w(t) + Δw(t)
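The answer is the additive update w(t + 1) = w(t) + Δw(t). Below is a minimal sketch of one common way this is realised in basic (winner-take-all) competitive learning, assuming the widely used rule Δw = η(x − w) applied to the winning unit only; the function name and numbers are illustrative, not from the question.

```python
import numpy as np

def competitive_update(W, x, eta=0.1):
    """One step of basic competitive learning.

    W: weight matrix, shape (num_units, num_inputs)
    x: input vector, shape (num_inputs,)
    """
    # Winner = unit whose weight vector is closest to the input.
    winner = np.argmin(np.linalg.norm(W - x, axis=1))
    delta_w = eta * (x - W[winner])      # Δw(t) for the winning unit only
    W[winner] = W[winner] + delta_w      # w(t + 1) = w(t) + Δw(t)
    return W, winner

# Illustrative usage with random data.
rng = np.random.default_rng(0)
W = rng.random((3, 2))                   # 3 competing units, 2-dimensional inputs
for x in rng.random((20, 2)):            # 20 training vectors
    W, _ = competitive_update(W, x)
print(W)
```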


Similar Questions

Q. An instar can respond to a set of input vectors even if it is not trained to capture the behaviour of the set?

Q. The weight change in plain Hebbian learning is?

Q. What is the nature of weights in plain Hebbian learning?

Q. How can divergence be prevented?

Q. Can divergence be prevented by normalizing the weights at every stage?

Q. What is Oja's rule?

Q. What is the other name of the feedback layer in competitive neural networks?

Q. What kind of feedback is given in the competitive layer?

Q. Generally, how many kinds of pattern storage networks exist?

Q. In competitive learning, the node with the highest activation is the winner. Is this true?

Q. What kind of learning is involved in a pattern clustering task?

Q. In pattern clustering, does the physical location of a unit relative to other units have any significance?

Q. How is a feature mapping network distinct from a competitive learning network?

Q. What is the objective of feature maps?

Q. How are weights updated in feature maps?

Q. In feature maps, when weights are updated for the winning unit and its neighbours, which type of learning is it known as?

Q. In a self-organizing network, how is the input layer connected to the output layer?

Q. What is true regarding the Adaline learning algorithm?

Q. What is true for competitive learning?

Q. Use of nonlinear units in the feedback layer of a competitive network leads to the concept of?