<P> Among other data mining methods, decision trees have several advantages: </P>
<Ul>
<Li> Simple to understand and interpret . People are able to understand decision tree models after a brief explanation, and trees can be displayed graphically in a way that is easy for non-experts to interpret . </Li>
<Li> Able to handle both numerical and categorical data . Other techniques are usually specialised in analysing datasets that have only one type of variable . (For example, relation rules can be used only with nominal variables, while neural networks can be used only with numerical variables or with categorical variables converted to 0-1 values .) </Li>
<Li> Requires little data preparation . Other techniques often require data normalization . Since trees can handle qualitative predictors, there is no need to create dummy variables . </Li>
<Li> Uses a white-box model . If a given situation is observable in a model, the explanation for the condition is easily expressed in Boolean logic . By contrast, in a black-box model such as an artificial neural network, the explanation for the results is typically difficult to understand . </Li>
<Li> Possible to validate a model using statistical tests, which makes it possible to account for the reliability of the model . </Li>
<Li> Non-statistical approach that makes no assumptions about the training data or prediction residuals; e.g., no distributional, independence, or constant-variance assumptions . </Li>
<Li> Performs well with large datasets . Large amounts of data can be analysed using standard computing resources in reasonable time . </Li>
<Li> Mirrors human decision making more closely than other approaches . This can be useful when modeling human decisions/behavior . </Li>
<Li> Robust against collinearity, particularly with boosting . </Li>
<Li> Built-in feature selection . Irrelevant additional features will be used less, so they can be removed on subsequent runs . </Li>
</Ul>
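<P> Several of the points above — handling mixed numerical and categorical data without normalization or dummy variables, and producing white-box rules readable as Boolean logic — can be illustrated with a minimal sketch of a single information-gain split, the criterion at the heart of tree-growing algorithms such as ID3/C4.5. The dataset, variable names, and helper functions below are illustrative, not from any particular library: </P>

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def matches(row, j, op, v):
    """Test one candidate split: a threshold for numeric features,
    an equality test for categorical ones."""
    return row[j] <= v if op == "<=" else row[j] == v

def best_split(rows, labels):
    """Return (gain, feature_index, op, value) for the single split
    with the highest information gain over the whole dataset."""
    base = entropy(labels)
    best = None
    for j in range(len(rows[0])):
        values = {r[j] for r in rows}
        numeric = all(isinstance(v, (int, float)) for v in values)
        # Numeric feature: try thresholds; categorical: try equality tests.
        # Note no scaling or dummy coding is needed for either kind.
        candidates = ([("<=", v) for v in sorted(values)[:-1]] if numeric
                      else [("==", v) for v in values])
        for op, v in candidates:
            left = [y for r, y in zip(rows, labels) if matches(r, j, op, v)]
            right = [y for r, y in zip(rows, labels) if not matches(r, j, op, v)]
            if not left or not right:
                continue
            gain = (base
                    - len(left) / len(labels) * entropy(left)
                    - len(right) / len(labels) * entropy(right))
            if best is None or gain > best[0]:
                best = (gain, j, op, v)
    return best

# Toy data: one numeric feature (age) and one categorical feature (colour),
# deliberately unscaled and mixed-type.
rows = [(25, "red"), (30, "blue"), (45, "red"), (50, "blue")]
labels = ["no", "no", "yes", "yes"]
gain, j, op, v = best_split(rows, labels)
# The result is a white-box rule readable as Boolean logic:
print(f"if x[{j}] {op} {v!r} then 'no' else 'yes'  (gain = {gain:.2f} bits)")
```

<P> The chosen split (here on the numeric feature, since the colour split yields no information gain) also hints at the built-in feature selection noted above: features whose candidate splits never win are simply never used by the tree. </P>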
