Discuss, Learn and be Happy: Discussion of Questions

Which activation function transforms the weighted output into the range [0,1]?
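
For reference, a minimal NumPy sketch of the sigmoid, which squashes a weighted sum z = w·x + b into the interval (0, 1):

    import numpy as np

    def sigmoid(z):
        # Squashes any real-valued input into the open interval (0, 1).
        return 1.0 / (1.0 + np.exp(-z))

    z = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
    print(sigmoid(z))  # ~[0.00005, 0.269, 0.5, 0.731, 0.99995]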

Why are saturated neurons problematic in neural networks using the sigmoid activation function?
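
As an illustration of the saturation issue: the sigmoid's local gradient, s * (1 - s) where s = sigmoid(z), collapses toward zero for large |z|, so gradients backpropagated through saturated units nearly vanish. A small sketch:

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def sigmoid_grad(z):
        # Local gradient of the sigmoid: s * (1 - s).
        s = sigmoid(z)
        return s * (1.0 - s)

    for z in [0.0, 2.0, 5.0, 10.0]:
        print(f"z={z:5.1f}  grad={sigmoid_grad(z):.6f}")
    # z=  0.0  grad=0.250000
    # z=  2.0  grad=0.104994
    # z=  5.0  grad=0.006648
    # z= 10.0  grad=0.000045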

Which activation function is known for not being zero-centered?
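
A quick sketch of what the question is getting at: sigmoid outputs are strictly positive, so the activations it passes to the next layer are never centered around zero (tanh, by contrast, roughly is):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.default_rng(0)
    z = rng.standard_normal(10_000)        # inputs centered at 0
    print(sigmoid(z).min() > 0)            # True: every sigmoid output is positive
    print(round(sigmoid(z).mean(), 2))     # ~0.5, so the outputs are not zero-centered
    print(round(np.tanh(z).mean(), 2))     # ~0.0, tanh outputs are roughly zero-centered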

Why is non-linearity crucial in neural networks?
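
One way to make the point concrete, as a sketch: without a non-linear activation between them, stacked linear layers collapse into a single linear map, so extra depth adds no expressive power:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(4)
    W1 = rng.standard_normal((5, 4))
    W2 = rng.standard_normal((3, 5))

    # Two linear layers with no activation in between...
    deep = W2 @ (W1 @ x)
    # ...are exactly one linear layer with weights W2 @ W1.
    shallow = (W2 @ W1) @ x
    print(np.allclose(deep, shallow))  # True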

Compared to the sigmoid, ReLU-based functions train much faster, are more expressive than the logistic function, and help prevent vanishing gradients.
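
A sketch of the gradient side of this claim: ReLU's gradient is 1 for any positive input, while the sigmoid's gradient peaks at 0.25 and decays toward zero, which is one reason ReLU-based networks tend to train faster and suffer less from vanishing gradients:

    import numpy as np

    def sigmoid_grad(z):
        s = 1.0 / (1.0 + np.exp(-z))
        return s * (1.0 - s)

    def relu_grad(z):
        # 1 where the unit is active, 0 otherwise.
        return (z > 0).astype(float)

    z = np.array([0.5, 2.0, 5.0, 10.0])
    print(relu_grad(z))     # [1. 1. 1. 1.]
    print(sigmoid_grad(z))  # ~[0.235, 0.105, 0.0066, 0.000045]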

What is the primary function of the layers in a deep neural network architecture?

What is the primary idea behind Convolutional Neural Networks (CNNs)?
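
To make the idea concrete, here is a minimal sketch (the helper name is just illustrative) of the core CNN mechanism: one small kernel with shared weights slides over the input, so each output value depends on a local patch and the same parameters are reused at every position:

    import numpy as np

    def conv2d_valid(image, kernel):
        # Slide one shared kernel over the image (no padding, stride 1).
        kh, kw = kernel.shape
        oh = image.shape[0] - kh + 1
        ow = image.shape[1] - kw + 1
        out = np.zeros((oh, ow))
        for i in range(oh):
            for j in range(ow):
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    image = np.arange(25, dtype=float).reshape(5, 5)
    kernel = np.array([[1.0, -1.0]])          # responds to horizontal changes
    print(conv2d_valid(image, kernel).shape)  # (5, 4)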

In neural networks, the role of the cost function is to quantify how well the model is performing on a given task.
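
As a concrete sketch, two common cost functions (illustrative implementations): mean squared error for regression and binary cross-entropy for classification, each reducing the gap between predictions and targets to a single number:

    import numpy as np

    def mse(y_true, y_pred):
        # Mean squared error: average squared gap between target and prediction.
        return np.mean((y_true - y_pred) ** 2)

    def binary_cross_entropy(y_true, p_pred, eps=1e-12):
        # Cross-entropy for 0/1 targets given predicted probabilities.
        p = np.clip(p_pred, eps, 1.0 - eps)
        return -np.mean(y_true * np.log(p) + (1.0 - y_true) * np.log(1.0 - p))

    y = np.array([1.0, 0.0, 1.0])
    p = np.array([0.9, 0.2, 0.7])
    print(mse(y, p))                   # ~0.047
    print(binary_cross_entropy(y, p))  # ~0.228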

What is the primary purpose of backpropagation in neural network training?
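
A minimal sketch of what backpropagation computes, for a single sigmoid neuron with squared error: the chain rule gives the gradient of the loss with respect to each parameter, and it matches a numerical finite-difference check:

    import numpy as np

    def forward(w, b, x):
        return 1.0 / (1.0 + np.exp(-(w * x + b)))   # sigmoid(w*x + b)

    def loss(w, b, x, y):
        return 0.5 * (forward(w, b, x) - y) ** 2

    def backprop(w, b, x, y):
        # Chain rule from the loss back to the parameters.
        a = forward(w, b, x)
        dz = (a - y) * a * (1.0 - a)   # dL/da * da/dz
        return dz * x, dz              # dL/dw, dL/db

    w, b, x, y = 0.7, -0.3, 2.0, 1.0
    gw, _ = backprop(w, b, x, y)
    eps = 1e-6
    num_gw = (loss(w + eps, b, x, y) - loss(w - eps, b, x, y)) / (2 * eps)
    print(np.isclose(gw, num_gw))      # True: analytic gradient matches the numeric check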

How are the parameters updated in backpropagation?
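
A sketch of the standard answer (the helper name is just illustrative): each parameter moves a small step against its gradient, scaled by a learning rate; this is plain gradient descent, and optimizers such as momentum or Adam refine the same rule:

    import numpy as np

    def gradient_descent_step(params, grads, lr=0.1):
        # Update rule: new_param = param - learning_rate * gradient.
        return {name: params[name] - lr * grads[name] for name in params}

    params = {"w": np.array([0.5, -1.2]), "b": np.array([0.1])}
    grads  = {"w": np.array([0.2,  0.4]), "b": np.array([-0.3])}
    print(gradient_descent_step(params, grads))
    # {'w': array([ 0.48, -1.24]), 'b': array([0.13])}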
