Results
Decision Tree: A decision tree is used to build tree-like models for regression or classification. It breaks a problem into smaller and smaller subsets while the associated decision tree is developed incrementally. A decision node can have two or more branches, and the leaf nodes represent the classes. A decision tree can handle both categorical and numerical values. The decision tree algorithm learns to predict the value of a target variable from simple decision rules inferred from the dataset. From the result of our decision tree, we can easily see how much importance a particular feature has and identify the feature that turned out to be the most important one in our model. Here the decision tree learned the training set perfectly and overfitted the data, which is why it gave poor predictions. Other values of the max_depth parameter therefore need to be tried out, as shown in the Figure.
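A minimal sketch of such a max_depth experiment and the feature-importance check, using scikit-learn, is given below; the dataset used in this work is not reproduced here, so load_iris and the variable names are only placeholder assumptions.

    # Sketch only: load_iris stands in for the actual dataset of this work.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Try several max_depth values; None lets the tree grow until it fits the
    # training set perfectly, which is where the overfitting appears.
    for depth in [None, 2, 3, 4, 5]:
        tree = DecisionTreeClassifier(max_depth=depth, random_state=0)
        tree.fit(X_train, y_train)
        print(depth,
              "train accuracy:", tree.score(X_train, y_train),
              "test accuracy:", tree.score(X_test, y_test))

    # feature_importances_ shows how much importance each feature has in the model.
    print(tree.feature_importances_)

Comparing the train and test accuracies for each depth makes the overfitting visible: an unrestricted tree scores perfectly on the training set but worse on unseen data.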
K-Nearest Neighbour (KNN): KNN is a supervised classification algorithm: it takes a set of labelled points and uses them to learn how to label a new point. To label a new point, it looks at the labelled points nearest to it and lets them vote; whichever label receives the most votes is assigned to the new point. Despite its simplicity, the results are excellent, so we tried different values for n.
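The voting behaviour and the sweep over different values of n can be sketched in the same way with scikit-learn's KNeighborsClassifier; again the dataset and the train/test split are placeholder assumptions, not the data of this work.

    # Sketch only: the dataset and split are placeholders.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Each candidate n lets that many nearest labelled points vote on the label.
    for n in [1, 3, 5, 7, 9]:
        knn = KNeighborsClassifier(n_neighbors=n)
        knn.fit(X_train, y_train)
        print("n_neighbors =", n, "test accuracy:", knn.score(X_test, y_test))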
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.