
Numerical Questions


Q1: Single-layer Perceptron Output

Inputs: x₁ = 1, x₂ = 0; Weights: w₁ = 0.6, w₂ = 0.4; Bias: b = -0.5. Activation: Binary Step.

Solution:
Weighted sum: net = w₁*x₁ + w₂*x₂ + b = 0.6*1 + 0.4*0 - 0.5 = 0.1
Binary step function: y = 1 if net ≥ 0, else y = 0
Since net = 0.1 ≥ 0, the output is y = 1
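
The same computation can be checked in a few lines of Python (a minimal sketch; the function name binary_step is ours, not from the question):

def binary_step(net):
    # Binary step activation: outputs 1 when the weighted sum is non-negative.
    return 1 if net >= 0 else 0

x1, x2 = 1, 0        # inputs
w1, w2 = 0.6, 0.4    # weights
b = -0.5             # bias

net = w1 * x1 + w2 * x2 + b   # 0.6*1 + 0.4*0 - 0.5 ≈ 0.1
y = binary_step(net)          # net ≥ 0, so the neuron fires
print(net, y)                 # ≈ 0.1, 1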
            

Q2: Hidden Layer Output in MLP

Inputs: X = [1, 0]; Weights: w₁₁=0.5, w₁₂=-0.4, w₂₁=0.3, w₂₂=0.2; Biases: b₁=0.1, b₂=-0.2; Activation: Sigmoid.

Solution:
Hidden neuron 1: net₁ = 0.5*1 + 0.3*0 + 0.1 = 0.6
h₁ = 1/(1 + e^(-0.6)) ≈ 0.646

Hidden neuron 2: net₂ = -0.4*1 + 0.2*0 - 0.2 = -0.6
h₂ = 1/(1 + e^(0.6)) ≈ 0.354

Hidden layer outputs: [0.646, 0.354]
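
A short Python sketch of this forward pass (the layout w[i][j] = weight from input i to hidden neuron j is our reading of the indices in the question):

import math

def sigmoid(net):
    # Logistic sigmoid activation.
    return 1.0 / (1.0 + math.exp(-net))

x = [1, 0]
w = [[0.5, -0.4],   # w₁₁, w₁₂: from input 1 to hidden neurons 1 and 2
     [0.3,  0.2]]   # w₂₁, w₂₂: from input 2 to hidden neurons 1 and 2
b = [0.1, -0.2]

# net_j = sum over inputs i of w[i][j]*x[i], plus bias b[j]
h = [sigmoid(sum(w[i][j] * x[i] for i in range(2)) + b[j]) for j in range(2)]
print([round(v, 3) for v in h])   # [0.646, 0.354]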
            

Q3: Backpropagation Weight Update

Output neuron: y = 0.6, Desired output: d = 1, Input to neuron: h = 0.5, Learning rate: η = 0.1, Activation: Sigmoid.

Solution:
Delta error: δ = (d - y) * y * (1 - y), where y(1 - y) is the sigmoid derivative
δ = (1 - 0.6) * 0.6 * 0.4 = 0.096

Weight update: Δw = η * δ * h = 0.1 * 0.096 * 0.5 = 0.0048
Updated weight: w_new = w_old + Δw = w_old + 0.0048
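
The update can be verified in Python (a sketch; the question leaves the old weight unspecified, so the value of w_old below is purely illustrative):

y, d = 0.6, 1.0   # actual and desired output
h = 0.5           # input to the output neuron
eta = 0.1         # learning rate

delta = (d - y) * y * (1 - y)   # sigmoid delta: ≈ 0.096
dw = eta * delta * h            # weight change: ≈ 0.0048
w_old = 0.3                     # illustrative value only; not given in the question
w_new = w_old + dw              # ≈ 0.3048
print(round(delta, 3), round(dw, 4), round(w_new, 4))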
            


            
