Code

This example helps us understand the decisions a classification tree makes on a chosen dataset and displays them as a tree diagram.

The button callback below first determines which dataset we want to study and defines its feature names so the data can be processed by the algorithm.

We train the tree on the data and its class labels, evaluate the data against the tree, then prune the tree to obtain a simpler model, and finally display the resulting classification error.
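The same train / evaluate / prune / predict workflow can be sketched in Python with scikit-learn (a hedged analogue, not part of the MATLAB project: `load_iris` stands in for `fisheriris`, and depth limiting stands in for MATLAB's `prune(t, 'level', 1)`, since scikit-learn has no level-based post-pruning):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.metrics import confusion_matrix
from sklearn.tree import DecisionTreeClassifier

# Load the iris data (analogous to "load fisheriris" in MATLAB)
iris = load_iris()
meas, species = iris.data, iris.target

# Train the tree (features SL, SW, PL, PW, as in the MATLAB code)
t = DecisionTreeClassifier(random_state=0).fit(meas, species)

# Test: classify the training data and compute the misclassification error
cm = confusion_matrix(species, t.predict(meas))
N = cm.sum()
err = (N - np.trace(cm)) / N
print("error:", err)

# "Prune" by restricting depth; cost-complexity pruning (ccp_alpha)
# would be the closer scikit-learn analogue to MATLAB's prune()
t2 = DecisionTreeClassifier(max_depth=2, random_state=0).fit(meas, species)

# Prediction for a single instance, as in the MATLAB callback
inst = [[4.9, 2.4, 3.3, 1.0]]
print("prediction:", t2.predict(inst))
```

As in the MATLAB version, the error is measured on the training data itself, so it understates the error you would see on held-out data.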

% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
iris  = get(handles.irisdata,'Value');
ecoli = get(handles.ecoli,'Value');
glass = get(handles.glass,'Value');

if iris == 1
    load fisheriris;
    % Train the tree
    t = classregtree(meas, species, 'names', {'SL', 'SW', 'PL', 'PW'});
    view(t);
    % Test: classify the training data and compute the error
    tresultado = eval(t, meas);
    cm = confusionmat(species, tresultado);
    N = sum(cm(:));
    err = (N - sum(diag(cm))) / N
    set(handles.error, 'String', err);
    % Prune the tree one level to get a simpler model
    t2 = prune(t, 'level', 1);
    view(t2);
    % Prediction for a single instance
    inst = [4.9, 2.4, 3.3, 1];
    prediction = eval(t2, inst)
elseif ecoli == 1
    load svmecoli;
    % Train the tree
    t = classregtree(meas, species, 'names', {'mcg', 'gvh', 'lip', 'chg', 'acc', 'alm1', 'alm2'});
    view(t);
    % Test: classify the training data and compute the error
    tresultado = eval(t, meas);
    cm = confusionmat(species, tresultado);
    N = sum(cm(:));
    err = (N - sum(diag(cm))) / N
    set(handles.error, 'String', err);
    % Prune the tree one level to get a simpler model
    t2 = prune(t, 'level', 1);
    view(t2);
    % Prediction for a single instance
    inst = [0.49, 0.29, 0.48, 0.50, 0.56, 0.24, 0.35];
    prediction = eval(t2, inst)
elseif glass == 1
    load svmglass1;
    % Train the tree
    t = classregtree(meas, species, 'names', {'Ri', 'Na', 'Mg', 'Al', 'Si', 'K', 'Ca', 'Ba', 'Fe'});
    view(t);
    % Test: classify the training data and compute the error
    tresultado = eval(t, meas);
    cm = confusionmat(species, tresultado);
    N = sum(cm(:));
    err = (N - sum(diag(cm))) / N
    set(handles.error, 'String', err);
    % Prune the tree one level to get a simpler model
    t2 = prune(t, 'level', 1);
    view(t2);
    % Prediction for a single instance
    inst = [1.51613, 13.88, 1.78, 1.79, 73.1, 0, 8.67, 0.76, 0];
    prediction = eval(t2, inst)
end
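The error shown in the GUI is simply the off-diagonal mass of the confusion matrix: total samples minus the correctly classified ones on the diagonal, divided by the total. A quick check with a made-up 3-class matrix (toy numbers, not actual output of this program):

```python
import numpy as np

# Toy 3-class confusion matrix (rows: true class, columns: predicted class)
cm = np.array([[50, 0, 0],
               [0, 47, 3],
               [0, 2, 48]])

N = cm.sum()                   # total number of samples (150)
err = (N - np.trace(cm)) / N   # misclassified fraction: 5/150
print(err)                     # about 0.0333
```

This mirrors the MATLAB line `err = (N - sum(diag(cm))) / N`, with `np.trace` playing the role of `sum(diag(cm))`.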