
Question

What is true regarding backpropagation rule?

a.

it is also called generalized delta rule

b.

error in output is propagated backwards only to determine weight updates

c.

there is no feedback of signal at any stage

d.

all of the mentioned

Posted under Neural Networks

Answer: (d).all of the mentioned
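Explanation (a hedged sketch, not from the original page): all three statements describe standard backpropagation — it generalizes the delta rule to multilayer networks, the output error is propagated backwards only to compute weight updates, and the network itself is feedforward with no signal feedback. A minimal illustration of the generalized delta rule on a single sigmoid unit, using an assumed toy OR-style dataset and hyperparameters chosen only for demonstration:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy task: one sigmoid unit learning an OR-like mapping.
# Generalized delta rule: delta = (target - output) * f'(net),
# and each weight moves by lr * delta * input. The error signal
# flows backwards only to determine the weight updates; the
# forward pass itself has no feedback loop.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w = [random.uniform(-0.5, 0.5) for _ in range(2)]
b = 0.0
lr = 0.5  # assumed learning rate

def total_error():
    return sum((t - sigmoid(w[0] * x[0] + w[1] * x[1] + b)) ** 2
               for x, t in data)

before = total_error()
for _ in range(1000):
    for x, t in data:
        net = w[0] * x[0] + w[1] * x[1] + b
        out = sigmoid(net)
        # sigmoid derivative is out * (1 - out)
        delta = (t - out) * out * (1 - out)
        w[0] += lr * delta * x[0]
        w[1] += lr * delta * x[1]
        b += lr * delta
after = total_error()
```

After training, the squared error over the dataset should be lower than before, showing the error-driven updates at work.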



Similar Questions

Discover Related MCQs

Q. Is there feedback in the final stage of the backpropagation algorithm?

Q. What is meant by generalized in statement “backpropagation is a generalized delta rule” ?

Q. What are the general limitations of the backpropagation rule?

Q. What are the general tasks that are performed with the backpropagation algorithm?

Q. Is backpropagation learning based on gradient descent along the error surface?

Q. How can the learning process be stopped in the backpropagation rule?

Q. Which is the simplest pattern recognition task in a feedback network?

Q. In a linear autoassociative network, if the input is noisy, will the output be noisy?

Q. Does a linear autoassociative network have any practical use?

Q. What can be done by using a non-linear output function for each processing unit in a feedback network?

Q. When are stable states reached in energy landscapes that can be used to store input patterns?

Q. What does the number of patterns that can be stored in a given network depend on?

Q. What happens when the number of available energy minima is less than the number of patterns to be stored?

Q. What happens when the number of available energy minima is more than the number of patterns to be stored?

Q. How can a hard problem be solved?

Q. Why is there error in recall when the number of energy minima is more than the required number of patterns to be stored?