<P> Decision trees can also be seen as generative models of induction rules from empirical data. An optimal decision tree is then defined as a tree that accounts for most of the data while minimizing the number of levels (or "questions"). Several algorithms for generating such optimal trees have been devised, such as ID3/4/5, CLS, ASSISTANT, and CART. </P> <P> Among decision support tools, decision trees (and influence diagrams) have several advantages. Decision trees: </P> <Ul> <Li> Are simple to understand and interpret. People are able to understand decision tree models after a brief explanation. </Li> <Li> Have value even with little hard data. Important insights can be generated based on experts describing a situation (its alternatives, probabilities, and costs) and their preferences for outcomes. </Li> <Li> Help determine worst, best, and expected values for different scenarios. </Li> <Li> Use a white box model. If a given result is provided by a model, the explanation for the result is easily replicated by simple math. </Li> <Li> Can be combined with other decision techniques. </Li> </Ul>
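<P> The "worst, best and expected values" point can be illustrated with a minimal sketch. The scenario below (a "launch" versus "hold" decision, with its probabilities and payoffs) is a hypothetical example, not taken from the source; a chance node is simply a list of (probability, outcome) pairs, where an outcome is either a numeric payoff or a nested chance node. </P>

```python
# Hypothetical decision: launch a product (uncertain payoff) or hold (certain payoff).
# A chance node is a list of (probability, outcome) pairs; an outcome is
# either a payoff (number) or another chance node.
launch = [(0.6, 120.0), (0.4, -30.0)]   # 60% chance of +120, 40% chance of -30
hold = [(1.0, 10.0)]                    # certain small gain

def expected_value(node):
    """Probability-weighted average of payoffs, recursing into nested chance nodes."""
    return sum(p * (expected_value(o) if isinstance(o, list) else o)
               for p, o in node)

def payoffs(node):
    """Yield every leaf payoff reachable from a node (for best/worst case)."""
    for _, o in node:
        if isinstance(o, list):
            yield from payoffs(o)
        else:
            yield o

for name, option in [("launch", launch), ("hold", hold)]:
    values = list(payoffs(option))
    print(name, expected_value(option), min(values), max(values))
```

<P> Rolling the tree back this way gives each alternative's expected value alongside its extreme outcomes, which is exactly the comparison a decision tree supports. </P>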
