

WEKA J48 decision tree with non-linearly separable data

weka,decision-tree,j48,c4.5
Does the Weka J48 decision tree classifier support classification for a problem with intrinsically non-linearly separable data? In short, is J48 a linear or a non-linear classifier?
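A decision tree is a non-linear classifier: it partitions the feature space with axis-aligned splits, so it can fit data that no single hyperplane separates. A minimal sketch of that point, using scikit-learn here purely for illustration rather than Weka's J48:

    from sklearn.tree import DecisionTreeClassifier

    # XOR: no linear classifier can separate these two classes,
    # but an axis-aligned decision tree fits them exactly.
    X = [[0, 0], [0, 1], [1, 0], [1, 1]]
    y = [0, 1, 1, 0]

    clf = DecisionTreeClassifier().fit(X, y)
    print(clf.predict(X))  # [0 1 1 0] -- all four points classified correctly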

Python Scikit Decision Tree with variable number of outputs

python,scikit-learn,decision-tree
I'm looking to set up a multi-output decision tree using the Python scikit-learn library. The problem I'm facing, however, is that it's not a simple "n_outputs" classification. Some samples will have 3 outputs, some 4, some 5. I'm not sure what the best way is to convey this to the library....
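One possible workaround, sketched below on the assumption that the variable-length outputs can be treated as label sets, is to binarize them into a fixed-width indicator matrix and train a multi-output tree on that:

    from sklearn.preprocessing import MultiLabelBinarizer
    from sklearn.tree import DecisionTreeClassifier

    X = [[0.1, 2.0], [1.3, 0.4], [0.9, 1.1]]
    # Variable-length label sets: 3, 2 and 1 outputs respectively (made-up data).
    y_sets = [["a", "b", "c"], ["a", "d"], ["b"]]

    mlb = MultiLabelBinarizer()
    Y = mlb.fit_transform(y_sets)        # fixed-width 0/1 indicator matrix

    clf = DecisionTreeClassifier().fit(X, Y)
    pred = clf.predict([[1.0, 1.0]])
    print(mlb.inverse_transform(pred))   # back to a variable-length label set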

CHAID regression tree to table conversion in R

r,packages,regression,decision-tree
I used the CHAID package from this link. It gives me a chaid object which can be plotted. I want a decision table with each decision rule in a column instead of a decision tree, but I don't understand how to access nodes and paths in this chaid object. Kindly help me...

How to stratify data using Orange?

python,decision-tree,orange
Looking for some help from the Orange experts out there. I have a data set of about 6 million lines. For simplicity's sake, we'll consider only two columns. One is of positive decimal numbers and is imported as a continuous value. The other is of discrete values (either 0 or...

Select a random value according to a distribution, Java equivalent [closed]

java,classification,distribution,decision-tree
I'm trying to code the extra-trees classifier algorithm proposed here, but I'm stuck on the part where I have to select a threshold A_th at random according to a distribution N(µ,σ), where µ and σ are, respectively, the mean and standard deviation of the pixel values at a position (k,l)....
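A sketch of the idea (in Python for illustration; in Java the equivalent draw is mu + sigma * Random.nextGaussian()), with made-up sample values standing in for the pixel values at position (k,l):

    import random

    def sample_threshold(values):
        """Draw a split threshold from N(mu, sigma) estimated from the given values."""
        n = len(values)
        mu = sum(values) / n
        sigma = (sum((v - mu) ** 2 for v in values) / n) ** 0.5
        # random.gauss draws from a normal distribution with the given mean
        # and standard deviation.
        return random.gauss(mu, sigma)

    print(sample_threshold([12, 15, 9, 14, 11]))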

Force the left-to-right order of nodes in Graphviz?

graph,graphviz,decision-tree,graph-visualization
I want to draw a decision tree chart using graphviz. The graph I want to draw looks like this: I am using the following dot language: graph a { A [shape=box; label="A"] B [shape=box; label="B"] al [shape=none; label="0"] bl [shape=none; label="1"] br [shape=none; label="0"] A -- al [label="0"]; A --...
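The usual fix is Graphviz's ordering=out attribute, which keeps each node's children in the order their edges are declared. A sketch using the Python graphviz package (the full edge list is guessed from the node names in the excerpt, so treat it as a placeholder):

    from graphviz import Graph  # the `graphviz` pip package (assumed available)

    # ordering="out" asks dot to lay out a node's children left-to-right
    # in the order their edges are declared.
    g = Graph("a", graph_attr={"ordering": "out"})
    for name in ("A", "B"):
        g.node(name, shape="box")
    for name, label in (("al", "0"), ("bl", "1"), ("br", "0")):
        g.node(name, label=label, shape="none")
    g.edge("A", "al", label="0")
    g.edge("A", "B", label="1")
    g.edge("B", "bl", label="1")
    g.edge("B", "br", label="0")
    print(g.source)  # or g.render("tree") to produce the image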

Can I manually create an RWeka decision (Recursive Partitioning) tree?

r,weka,decision-tree
I have constructed a J48 decision tree using RWeka. I would like to compare its performance to an existing (externally computed) decision tree. I'm new to RWeka and I'm having trouble manually creating an RWeka decision tree. Ideally, I would like to show the two side-by-side...

Create a vector of accuracy measures in CARET for repeated hold-out samples

r,decision-tree,caret
I want to create a vector of accuracy measures from decision trees built on repeated holdout samples (of the same size). I am trying this in caret. library(caret) ctrl <- trainControl(method = "LGOCV", repeats = 60, p=0.66) mod1 <- train(Species ~ ., data = iris, method = "rpart", trControl = ctrl) My...
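If memory serves, caret keeps the per-resample accuracies in mod1$resample. The same repeated-holdout idea, sketched in scikit-learn for comparison (assuming the iris data as in the excerpt):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import ShuffleSplit, cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    # 60 repeated holdout splits with 66% training data, mirroring
    # LGOCV with repeats = 60 and p = 0.66.
    cv = ShuffleSplit(n_splits=60, train_size=0.66, random_state=0)
    scores = cross_val_score(DecisionTreeClassifier(), X, y, cv=cv)
    print(scores)  # one accuracy value per holdout repetition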

Finding a corresponding leaf node for each data point in a decision tree (scikit-learn)

python,machine-learning,scikit-learn,decision-tree
I'm using the decision tree classifier from the scikit-learn package in Python 3.4, and I want to get the corresponding leaf node id for each of my input data points. For example, my input might look like this: array([[ 5.1, 3.5, 1.4, 0.2], [ 4.9, 3. , 1.4, 0.2], [ 4.7,...
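Recent scikit-learn versions expose this directly through the tree's apply() method, which returns the leaf node id for each sample; availability in very old releases should be checked. A minimal sketch on iris:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    clf = DecisionTreeClassifier().fit(X, y)

    # apply() maps each sample to the id of the leaf node it falls into.
    leaf_ids = clf.apply(X)
    print(leaf_ids[:5])          # leaf id for the first five samples
    print(np.unique(leaf_ids))   # the set of leaf node ids actually used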

Dynamic if-then Code

java,decision-tree
I'm using a decision tree algorithm and I get if-then rules (returned as text), for example: if(Parameter1 > 10) then if(Parameter2 < 5) then do A else do B else do C. I want to use these rules in order to get decisions for a few items: item(Parameter1, Parameter2), for example: item1(15, 5), item2(10,...
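One way to make such rule text executable without hard-coding it is to parse it into a small nested structure and walk it recursively. A sketch of that idea (in Python for illustration; the rule structure below is hand-built from the example rules rather than parsed):

    # Each internal node is a dict with a test and two branches; each leaf is an action.
    RULES = {"test": ("Parameter1", ">", 10),
             "then": {"test": ("Parameter2", "<", 5), "then": "A", "else": "B"},
             "else": "C"}

    OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b}

    def decide(item, node=RULES):
        if isinstance(node, str):          # leaf node: the action to take
            return node
        name, op, value = node["test"]
        branch = "then" if OPS[op](item[name], value) else "else"
        return decide(item, node[branch])

    print(decide({"Parameter1": 15, "Parameter2": 5}))   # -> "B"
    print(decide({"Parameter1": 10, "Parameter2": 3}))   # -> "C"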

Why do I get this error below while using the Cubist package in R?

r,regression,decision-tree,non-linear-regression
I have a personal dataset, which I split into the variable to predict and the predictors. The syntax is as follows: library(Cubist) str(A) 'data.frame': 6038 obs. of 3 variables: $ ads_return_count : num 7 10 10 4 10 10 10 10 10 9 ... $ actual_cpc : num 0.0678 0.3888 0.2947 0.0179...

Entropy of pure split calculated as NaN

matlab,decision-tree,entropy
I have written a function to calculate the entropy of a vector where each element represents the number of elements of a class. function x = Entropy(a) t = sum(a); t = repmat(t, [1, size(a, 2)]); x = sum(-a./t .* log2(a./t)); end E.g.: a = [4 0], then entropy = -(0/4)*log2(0/4) -...
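The NaN comes from the pure class: (0/4)*log2(0/4) evaluates to 0 * -Inf = NaN, whereas by convention 0*log2(0) should count as 0. A sketch of the usual fix (shown in Python/NumPy; the MATLAB version is analogous, e.g. dropping zero counts before summing):

    import numpy as np

    def entropy(counts):
        """Entropy of a class-count vector, treating 0*log2(0) as 0."""
        counts = np.asarray(counts, dtype=float)
        p = counts / counts.sum()
        nz = p > 0                  # keep only non-zero probabilities to avoid 0 * -inf
        return -np.sum(p[nz] * np.log2(p[nz])) + 0.0   # "+ 0.0" normalises -0.0 to 0.0

    print(entropy([4, 0]))   # 0.0 -- a pure split has zero entropy
    print(entropy([2, 2]))   # 1.0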

Is it possible to add duration and easing to window.scrollTo?

javascript,interactive,scrollto,decision-tree,easing
I'm using Bill Miller's Interactive Decision guide code. http://www.guiideas.com/2013/09/interactive-decision-guide.html To scroll new questions into view at the bottom of the page he uses window.scrollTo //scroll code to bring next question into view var qNextPos = $('#qTable' + qNext).offset(); var qNextTop = qNextPos.top; var qNextHigh = $('#qTable' + qNext).height(); var qNextBot...

Use a DecisionTree throughout the whole lifetime of a C# .NET application

c#,.net,asp.net-mvc,decision-tree
I have a web application in development. I'm thinking about using a DecisionTree to analyse certain things. The DecisionTree has to be created and will be used in different phases. E.g. in a controller something will be compared/checked and a certain view will be returned. Do I create this DecisionTree at...

What splitting criterion does Random Tree in Weka 3.7.11 use for numerical attributes?

machine-learning,weka,random-forest,decision-tree
I'm using RandomForest from Weka 3.7.11, which in turn is bagging Weka's RandomTree. My input attributes are numerical and the output attribute (label) is also numerical. When training the RandomTree, K attributes are chosen at random for each node of the tree. Several splits based on those attributes are attempted and...
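For numeric (regression) targets, trees typically score candidate splits by variance reduction; whether Weka's RandomTree does exactly this should be checked against its source, but the idea is sketched below:

    def variance(values):
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / len(values)

    def split_score(y_left, y_right):
        """Weighted variance of the two children of a candidate split (lower is better)."""
        n = len(y_left) + len(y_right)
        return (len(y_left) * variance(y_left) + len(y_right) * variance(y_right)) / n

    # Candidate split of a numeric label vector at some threshold (made-up values):
    print(split_score([1.0, 1.2, 0.9], [5.1, 4.8, 5.3, 5.0]))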

Decision tree visualization error in rattle: "Error: the FUN argument to prp is not a function"

r,visualization,decision-tree,rattle
I created a decision tree in rattle for the built-in wine dataset. The output is shown below. Summary of the Decision Tree model for Classification (built using 'rpart'): library(rpart) library(rattle) n= 124 node), split, n, loss, yval, (yprob) * denotes terminal node 1) root 124 73 2 (0.30645161 0.41129032 0.28225806)...

When does a rule-based classifier outperform decision trees?

machine-learning,classification,data-mining,decision-tree
Suppose I have the option to choose between building a decision tree and a rule-based classifier; which one should I choose? Assuming that the rule-based classifier has a mutually exclusive and exhaustive set of rules, which one is preferable? Is there some specific advantage/drawback of a rule-based classifier...

Pruning a rule-based classification tree (PART algorithm)

r,statistics,classification,decision-tree,rweka
I am using the PART algorithm in R (via the RWeka package) for multi-class classification. The target attribute is the time bucket in which an invoice will be paid by the customer (like 7-15 days, 15-30 days, etc.). I am using the following code for fitting and predicting from the model: fit <- PART(DELAY_CLASS ~...

Same decision tree, different results

weka,decision-tree,j48
I work on a machine learning application and use Weka for testing, comparing classification algorithms, etc. After the test operations in Weka, I decided to use the J48 decision tree. I parsed the pruned tree which Weka had produced and implemented it in if-then format in C. However, if I tested...

Splitting a List into sublists based on unique values

java,arraylist,java-8,classification,decision-tree
I have a list of lists: List<ArrayList<String>> D = new ArrayList<>(); When it's populated, it might look like: ["A", "B", "Y"] ["C", "D", "Y"] ["A", "D", "N"] I want to split the list of lists into partitions based on the unique attribute values (let's say index 1). So the attribute...
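In Java 8 this is typically Collectors.groupingBy(row -> row.get(1)); the same grouping idea is sketched below in Python (the single illustration language used for these sketches):

    from collections import defaultdict

    D = [["A", "B", "Y"],
         ["C", "D", "Y"],
         ["A", "D", "N"]]

    def partition_by(rows, index):
        """Group rows into sublists keyed by the value at the given column index."""
        groups = defaultdict(list)
        for row in rows:
            groups[row[index]].append(row)
        return dict(groups)

    print(partition_by(D, 1))
    # {'B': [['A', 'B', 'Y']], 'D': [['C', 'D', 'Y'], ['A', 'D', 'N']]}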