1. Backpropagation is a learning technique that adjusts weights in the neural network by propagating weight changes:
a. Forward from source to sink
b. Backward from sink to source
c. Forward from source to hidden nodes
d. Backward from sink to hidden nodes
Answer: (b) Backward from sink to source

2. Identify the following activation function: φ(V) = Z + 1 / (1 + exp(−X * V + Y)), where Z, X, Y are parameters.
a. Step function
b. Ramp function
c. Sigmoid function
d. Gaussian function
Answer: (c) Sigmoid function
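A quick sanity check of the function in Q2, as a minimal Python sketch. The parameter names `z`, `x`, `y` mirror the question's Z, X, Y (offset, slope, horizontal shift) and are illustrative only:

```python
import math

def generalized_sigmoid(v, z=0.0, x=1.0, y=0.0):
    """phi(v) = z + 1 / (1 + exp(-x*v + y)) -- the function from Q2."""
    return z + 1.0 / (1.0 + math.exp(-x * v + y))

# With z = 0, x = 1, y = 0 this reduces to the standard logistic sigmoid,
# which is 0.5 at v = 0 and saturates toward 0 and 1 at the extremes.
print(generalized_sigmoid(0.0))   # 0.5
print(generalized_sigmoid(10.0))  # close to 1
```

The smooth S-shaped saturation is what identifies it as a sigmoid rather than a step, ramp, or Gaussian.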

3. An artificial neuron receives n inputs x1, x2, x3, …, xn with weights w1, w2, …, wn attached to the input links. The weighted sum _________ is computed and passed on to a non-linear filter Φ, called the activation function, to release the output.
a. Σ wi
b. Σ xi
c. Σ wi + Σ xi
d. Σ wi * xi
Answer: (d) Σ wi * xi
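The weighted sum in Q3 can be sketched in a few lines of Python (the function name `weighted_sum` is illustrative):

```python
def weighted_sum(xs, ws):
    """Net input to the neuron: sum over i of w_i * x_i."""
    return sum(w * x for w, x in zip(ws, xs))

# Three inputs, each weighted by 0.5: 0.5*1 + 0.5*2 + 0.5*3 = 3.0
print(weighted_sum([1.0, 2.0, 3.0], [0.5, 0.5, 0.5]))  # 3.0
```

Each input is multiplied by its own weight before summing, which is why Σ wi * xi is correct and the other options (which sum weights or inputs separately) are not.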

4. Match the following knowledge representation techniques with their applications:
List – I
(a) Frames
(b) Conceptual dependencies
(c) Associative networks
(d) Scripts
List – II
(i) Pictorial representation of objects, their attributes and relationships
(ii) To describe real-world stereotype events
(iii) Record-like structures for grouping closely related knowledge
(iv) Structures and primitives to represent sentences
Codes:
    a     b     c     d
a. (iii) (iv)  (i)   (ii)
b. (iii) (iv)  (ii)  (i)
c. (iv)  (iii) (i)   (ii)
d. (iv)  (iii) (ii)  (i)
Answer: (a) (iii) (iv) (i) (ii)

5. In propositional logic, P ⇔ Q is equivalent to (where ~ denotes NOT):
a. ~(P ˅ Q) ˄ ~(Q ˅ P)
b. (~P ˅ Q) ˄ (~Q ˅ P)
c. (P ˅ Q) ˄ (Q ˅ P)
d. ~(P ˅ Q) → ~(Q ˅ P)
Answer: (b) (~P ˅ Q) ˄ (~Q ˅ P)
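The equivalence in Q5 follows because P ⇔ Q is (P → Q) ˄ (Q → P), and each implication rewrites as a disjunction (P → Q ≡ ~P ˅ Q). A short truth-table check in Python confirms it over all four assignments:

```python
from itertools import product

# Verify (P <=> Q)  ==  (~P v Q) ^ (~Q v P) for every truth assignment.
for p, q in product([True, False], repeat=2):
    iff = (p == q)                        # P <=> Q
    cnf = (not p or q) and (not q or p)   # (~P v Q) ^ (~Q v P)
    assert iff == cnf
print("equivalent for all truth assignments")
```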

6. Slots and facets are used in:
a. Semantic Networks
b. Frames
c. Rules
d. All of these
Answer: (b) Frames

7. A neuron with 3 inputs has the weight vector [0.2  −0.1  0.1]^T and a bias θ = 0. If the input vector is X = [0.2  0.4  0.2]^T, then the total input to the neuron is:
a. 0.20
b. 1.0
c. 0.02
d. −1.0
Answer: (c) 0.02
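The arithmetic in Q7 is the same weighted sum as in Q3, worked with the given numbers: 0.2·0.2 + (−0.1)·0.4 + 0.1·0.2 = 0.04 − 0.04 + 0.02 = 0.02 (the bias θ = 0 adds nothing). A minimal check:

```python
w = [0.2, -0.1, 0.1]   # weight vector
x = [0.2, 0.4, 0.2]    # input vector
theta = 0.0            # bias

# Total (net) input: sum of w_i * x_i, plus the bias.
net = sum(wi * xi for wi, xi in zip(w, x)) + theta
print(round(net, 2))   # 0.02
```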