
Concept of Bias and Threshold in Artificial Neural Networks (ANNs)

 

🧠 1. Artificial Neuron Model (Base Formula)

An artificial neuron computes a weighted sum of its inputs and then applies an activation function.

y = f\left(\sum_{i=1}^{n} w_i x_i + b\right)

where:

  • x_i = input signals

  • w_i = weights

  • b = bias

  • f = activation function

  • y = output of the neuron
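The formula above can be sketched as a small Python function (the names `neuron` and `step` are illustrative, not from any library):

```python
# A single artificial neuron: weighted sum of inputs plus bias,
# passed through an activation function f.
def neuron(x, w, b, f):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return f(z)

# Binary step activation as a simple choice of f.
def step(z):
    return 1 if z >= 0 else 0

# Inputs [1.0, 0.5], weights [0.4, 0.6], bias -0.5:
# weighted sum = 0.7, so 0.7 + (-0.5) = 0.2 >= 0, output 1
print(neuron([1.0, 0.5], [0.4, 0.6], -0.5, step))  # → 1
```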


⚙️ 2. Concept of Bias

  • Bias is an additional constant input added to the weighted sum before applying the activation function.

  • It allows the activation function to shift left or right on the graph — helping the neuron activate even when all inputs are zero.

Mathematically:
If we remove bias, the equation is:

y = f\left(\sum w_i x_i\right)

The neuron's decision boundary can then only pass through the origin (0, 0).

By adding bias b:

y = f\left(\sum w_i x_i + b\right)

the line (or decision boundary) can shift away from the origin, improving flexibility.

🔹Analogy:
Think of bias as the intercept (c) in the line equation y = mx + c.
It helps control when the neuron "fires."
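A quick sketch of this shift, using a sigmoid activation (the values are illustrative): with all inputs at zero, the output is pinned at the curve's midpoint unless a bias is present.

```python
import math

def sigmoid(z):
    # Standard logistic activation.
    return 1 / (1 + math.exp(-z))

w, x = [0.8], [0.0]  # all inputs are zero
z = sum(wi * xi for wi, xi in zip(w, x))

print(sigmoid(z))        # 0.5: without bias, stuck at the midpoint
print(sigmoid(z + 2.0))  # ≈ 0.88: a bias of 2 shifts the curve, so the neuron can still activate
```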


🔒 3. Concept of Threshold

  • The threshold is a value that determines whether the neuron should activate (output 1) or remain inactive (output 0).

  • It works like a cutoff point.

If:

\sum w_i x_i \geq \text{threshold}, \text{ output} = 1

else

\text{output} = 0

Example:
Let threshold θ = 0.5.
If weighted sum = 0.7 → output = 1
If weighted sum = 0.3 → output = 0
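The cutoff rule above, as a short sketch (the function name `fires` is illustrative):

```python
def fires(weighted_sum, theta=0.5):
    # Threshold rule: activate only when the weighted sum reaches theta.
    return 1 if weighted_sum >= theta else 0

print(fires(0.7))  # → 1
print(fires(0.3))  # → 0
```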


🔄 4. Relationship between Bias and Threshold

Bias and threshold act in opposite directions but are mathematically related:

b = -\theta

So we can rewrite:

y = f\left(\sum w_i x_i - \theta\right) = f\left(\sum w_i x_i + b\right)

That’s why modern neural networks use “bias” instead of “threshold”: it folds the cutoff into the weighted sum, so the bias can be learned just like any other weight.
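The equivalence b = -θ can be checked directly: both formulations give the same output for any weighted sum (the values here are illustrative).

```python
def step(z):
    return 1 if z >= 0 else 0

theta = 0.5
b = -theta  # bias is the negative of the threshold

for s in [0.3, 0.5, 0.7]:
    threshold_form = 1 if s >= theta else 0  # compare the sum against theta
    bias_form = step(s + b)                  # fold theta into the sum as a bias
    assert threshold_form == bias_form
print("both formulations agree")
```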


5. Summary Table

| Concept | Meaning | Role |
|---|---|---|
| Bias (b) | Constant added to the weighted sum | Shifts the activation curve left/right |
| Threshold (θ) | Minimum value needed to activate the neuron | Decides when the neuron fires |
| Relation | b = -θ | Bias is the negative of the threshold |

Example (Binary Step Neuron):

y = \begin{cases} 1, & \text{if } w_1 x_1 + w_2 x_2 + b \ge 0 \\ 0, & \text{otherwise} \end{cases}

Here, bias = -threshold, and it adjusts when the neuron turns ON.
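For instance, a binary step neuron with illustrative weights w1 = w2 = 1 and bias b = -1.5 (i.e., threshold 1.5) behaves as an AND gate:

```python
def step(z):
    return 1 if z >= 0 else 0

w1, w2, b = 1.0, 1.0, -1.5  # threshold θ = 1.5, so b = -1.5
for x1 in (0, 1):
    for x2 in (0, 1):
        y = step(w1 * x1 + w2 * x2 + b)
        print(x1, x2, "->", y)  # only (1, 1) reaches the threshold
```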
