To build and evaluate a Decision Tree using the ID3 algorithm:
- Select the corpus on which to run the ID3 algorithm and click on the SUBMIT button.
- To proceed further, click on the NEXT button.
- Now, find the entropy of the whole dataset by entering the required values (the total number of Yes examples and the total number of No examples), then click on the SUBMIT button.
- Repeat the previous step to find the entropy of each individual attribute value.
- Click on the NEXT button.
- Now, find the information gain for the first attribute, i.e., Outlook.
- Enter the required values.
- To view the entropy of the dataset, H(D), and the entropy of each attribute value, H(Dᵥ), use the tabs on the left side.
- Click on the SUBMIT button to get the gain value.
- Repeat this procedure to find the information gain for the remaining attributes (Temperature, Humidity, and Wind).
NOTE: The attribute with the highest gain value is selected as the ROOT node of the decision tree.
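The entropy and information-gain quantities that the steps above ask you to enter can be sketched in code. This is a minimal illustration, not the lab's implementation; the play-tennis dataset below is an assumption based on the attributes the lab names (Outlook, Temperature, Humidity, Wind), not the lab's actual corpus.

```python
from collections import Counter
from math import log2

def entropy(labels):
    # H(D) = -sum over classes of p * log2(p), where p is each class proportion.
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr_index):
    # Gain(D, A) = H(D) - sum over values v of |D_v|/|D| * H(D_v).
    parts = {}
    for row, label in zip(rows, labels):
        parts.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(p) / len(labels) * entropy(p) for p in parts.values())
    return entropy(labels) - remainder

# Assumed corpus: the classic 14-example play-tennis dataset
# (columns: Outlook, Temperature, Humidity, Wind).
rows = [
    ("Sunny", "Hot", "High", "Weak"),      ("Sunny", "Hot", "High", "Strong"),
    ("Overcast", "Hot", "High", "Weak"),   ("Rain", "Mild", "High", "Weak"),
    ("Rain", "Cool", "Normal", "Weak"),    ("Rain", "Cool", "Normal", "Strong"),
    ("Overcast", "Cool", "Normal", "Strong"), ("Sunny", "Mild", "High", "Weak"),
    ("Sunny", "Cool", "Normal", "Weak"),   ("Rain", "Mild", "Normal", "Weak"),
    ("Sunny", "Mild", "Normal", "Strong"), ("Overcast", "Mild", "High", "Strong"),
    ("Overcast", "Hot", "Normal", "Weak"), ("Rain", "Mild", "High", "Strong"),
]
labels = ["No", "No", "Yes", "Yes", "Yes", "No", "Yes",
          "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"]

print(f"H(D) = {entropy(labels):.3f}")  # 9 Yes, 5 No -> about 0.940
for i, name in enumerate(["Outlook", "Temperature", "Humidity", "Wind"]):
    print(f"Gain({name}) = {info_gain(rows, labels, i):.3f}")
# Outlook has the highest gain (about 0.247), so it becomes the root.
```

For this dataset, Outlook's gain exceeds that of Temperature, Humidity, and Wind, which is why the NOTE above selects it as the root.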
- Click on the NEXT button to compute the tree.
- Now, find the entropy of the 'leaf' node of the tree (the one that still contains both YES and NO examples) by entering the required values, then click on the SUBMIT button.
- Now, find the entropy of every individual attribute value by clicking on the ENTROPY button.
- Now, find the information gain for every attribute by clicking on the IG button.
- Click on the NEXT button.
- Repeat the entropy and information-gain steps above until the final decision tree is obtained.
- Finally, click on the PLOT button to display the decision tree.
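The repeat-until-pure loop in the steps above is exactly what the ID3 algorithm automates: pick the highest-gain attribute, split, and recurse on each impure branch. Below is a minimal self-contained sketch of that recursion; the play-tennis dataset and attribute names are illustrative assumptions, not the lab's actual corpus.

```python
from collections import Counter
from math import log2

ATTR_NAMES = ["Outlook", "Temperature", "Humidity", "Wind"]  # assumed schema

def entropy(labels):
    # H(D) = -sum over classes of p * log2(p).
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def id3(rows, labels, attrs):
    # All examples agree -> pure leaf.
    if len(set(labels)) == 1:
        return labels[0]
    # No attributes left to split on -> majority-class leaf.
    if not attrs:
        return Counter(labels).most_common(1)[0][0]

    def gain(a):
        # Gain(D, A) = H(D) - sum over values v of |D_v|/|D| * H(D_v).
        parts = {}
        for row, lab in zip(rows, labels):
            parts.setdefault(row[a], []).append(lab)
        return entropy(labels) - sum(
            len(p) / len(labels) * entropy(p) for p in parts.values())

    best = max(attrs, key=gain)  # highest-gain attribute becomes this node
    branches = {}
    for value in {row[best] for row in rows}:
        keep = [(r, l) for r, l in zip(rows, labels) if r[best] == value]
        branches[value] = id3([r for r, _ in keep],
                              [l for _, l in keep],
                              [a for a in attrs if a != best])
    return {ATTR_NAMES[best]: branches}

# Assumed corpus: the classic 14-example play-tennis dataset.
rows = [
    ("Sunny", "Hot", "High", "Weak"),      ("Sunny", "Hot", "High", "Strong"),
    ("Overcast", "Hot", "High", "Weak"),   ("Rain", "Mild", "High", "Weak"),
    ("Rain", "Cool", "Normal", "Weak"),    ("Rain", "Cool", "Normal", "Strong"),
    ("Overcast", "Cool", "Normal", "Strong"), ("Sunny", "Mild", "High", "Weak"),
    ("Sunny", "Cool", "Normal", "Weak"),   ("Rain", "Mild", "Normal", "Weak"),
    ("Sunny", "Mild", "Normal", "Strong"), ("Overcast", "Mild", "High", "Strong"),
    ("Overcast", "Hot", "Normal", "Weak"), ("Rain", "Mild", "High", "Strong"),
]
labels = ["No", "No", "Yes", "Yes", "Yes", "No", "Yes",
          "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"]

print(id3(rows, labels, list(range(4))))
```

For this dataset the recursion yields the textbook tree: Outlook at the root, with Sunny branching on Humidity, Rain branching on Wind, and Overcast a pure Yes leaf — the same structure the PLOT step draws.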