Decision Tree:-
① Decision Tree is a Supervised learning technique that can be used for both classification and Regression problems. However, it is more preferred for classification.
② It is a tree-structured classifier where the internal nodes represent the features of the data set, the branches represent the decision rules, and each leaf node represents the outcome.
③ There are 2 types of nodes, which are decision nodes & leaf nodes.
④ Decision nodes make decisions and have multiple branches, while leaf nodes are the final outcomes of these decisions and do not contain further branches.

[Diagram: a decision tree with a root node at the top, decision nodes branching below it (one branch forming a sub-tree), and leaf nodes at the ends.]

Entropy:-
① Entropy is defined as the randomness or measure of disorder of the information processed in Machine Learning. In simple words, we can say that Entropy in machine learning is the metric that measures the unpredictability or impurity in the system.
② It determines how a decision tree chooses to split data.

[Diagram: sets ranging from impure (high entropy) to minimum impurity (low entropy).]
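The entropy idea above can be sketched in a few lines of Python (the label lists below are made-up examples, not data from these notes):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    counts = Counter(labels)
    return sum(-(c / total) * log2(c / total) for c in counts.values())

# A pure node (all one class) has minimum impurity: entropy 0.
print(entropy(["yes", "yes", "yes", "yes"]))  # 0.0
# A 50/50 mix is maximally unpredictable: entropy 1 bit.
print(entropy(["yes", "yes", "no", "no"]))    # 1.0
```

The decision tree prefers splits that move the subsets toward the pure (low-entropy) end.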
Information Gain:-
① When we use a node in a decision tree to partition the data into smaller subsets, the entropy changes. This change in entropy is called information gain.
② Entropy is given by,
        E(S) = -Σᵢ pᵢ log₂(pᵢ)
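The change in entropy described above can be computed directly. A minimal sketch, using a hypothetical 10-sample split (the data and the two-branch split are invented for illustration):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    counts = Counter(labels)
    total = len(labels)
    return sum(-(c / total) * log2(c / total) for c in counts.values())

def information_gain(parent, subsets):
    """Parent entropy minus the size-weighted entropy of the child subsets."""
    n = len(parent)
    weighted = sum(len(s) / n * entropy(s) for s in subsets)
    return entropy(parent) - weighted

# Hypothetical split: 10 samples partitioned by some feature into two branches.
parent = ["yes"] * 5 + ["no"] * 5
left = ["yes"] * 4 + ["no"]
right = ["yes"] + ["no"] * 4
print(round(information_gain(parent, [left, right]), 3))  # 0.278
```

A split that leaves the subsets as mixed as the parent has zero information gain; the tree picks the feature whose split yields the highest gain.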
# Gini Impurity:-
① The working of Gini Impurity is somewhat similar to entropy.
② Both of them are used for building splits in a Decision Tree, but there is quite a difference in both of their formulas.
③ Gini Impurity after splitting can be calculated by the formula,
        Gini_split = Σₖ (|Sₖ|/|S|) · Gini(Sₖ),   where Gini(S) = 1 - Σᵢ pᵢ²

# Naive Bayes
① Naive Bayes algorithm is a supervised learning algorithm, which means that it also works on labelled data.
② Naive Bayes algorithm uses Bayes' theorem and is used for solving classification problems.
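The Gini formulas can be sketched the same way as entropy. This is a minimal illustration with made-up label lists, using the standard Gini(S) = 1 - Σᵢ pᵢ² definition:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 minus the sum of squared class probabilities."""
    total = len(labels)
    counts = Counter(labels)
    return 1.0 - sum((c / total) ** 2 for c in counts.values())

def gini_after_split(subsets):
    """Size-weighted average Gini impurity of the child subsets."""
    n = sum(len(s) for s in subsets)
    return sum(len(s) / n * gini(s) for s in subsets)

print(gini(["yes", "yes", "no", "no"]))  # 0.5 -- maximum for two classes
print(gini(["yes", "yes", "yes"]))       # 0.0 -- a pure node
# Hypothetical split: a mostly-"yes" branch and a pure "no" branch.
left = ["yes"] * 4 + ["no"]
right = ["no"] * 5
print(round(gini_after_split([left, right]), 3))  # 0.16
```

Note the difference in range: Gini tops out at 0.5 for two classes, whereas entropy tops out at 1 bit; both reach 0 for a pure node, which is why the tree treats them similarly as split criteria.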