knnB              package:MLInterfaces              R Documentation

_A_n _i_n_t_e_r_f_a_c_e _t_o _v_a_r_i_o_u_s _m_a_c_h_i_n_e _l_e_a_r_n_i_n_g _m_e_t_h_o_d_s _f_o_r _e_x_p_r_S_e_t_s

_D_e_s_c_r_i_p_t_i_o_n:

     This document describes a family of wrappers for calls to machine
     learning classifiers distributed across various R packages. This
     particular document concerns the classifiers for which a
     training-vs-test set split makes sense.

     For example, 'knnB' is a wrapper for a call to 'knn' for objects
     of class 'exprSet'. These interfaces, of the form '[f]B', provide
     a common calling sequence and a common return value for the
     machine learning code in function '[f]'.

     For details on the additional arguments that may be passed to any
     covered machine learning function 'f', check the manual page for
     that function. This will require loading the package in which 'f'
     is found.
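     As a sketch of this pass-through behavior (assuming the
     'golubEsets' data used in the Examples below is available),
     'knn'-specific arguments such as 'k' and 'l' can be supplied
     directly to 'knnB':

     ```r
     ## Sketch: extra arguments are forwarded to class::knn.
     ## Assumes the golubEsets package is installed.
     library(MLInterfaces)
     library(golubEsets)
     data(golubMerge)
     smallG <- golubMerge[1:60,]
     ## k (neighbors) and l (minimum vote) pass through to knn
     knnB(smallG, "ALL.AML", 1:40, k = 3, l = 2)
     ```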

_U_s_a_g_e:

     knnB(exprObj, classifLab, trainInd, k = 1, l = 1, prob = TRUE,
       use.all = TRUE, metric = "euclidean") 

_A_r_g_u_m_e_n_t_s:

 exprObj: An instance of the 'exprSet' class. 

classifLab: A character string naming the 'phenoData' variable of
     'exprObj' that holds the class labels. 

trainInd: An integer vector giving the indices of the samples that
     form the training set. 

       k: The number of nearest neighbors. 

       l: See 'knn' for a complete description. 

    prob: See 'knn' for a complete description. 

 use.all: See 'knn' for a complete description. 

  metric: See 'knn' for a complete description. 

_D_e_t_a_i_l_s:

     See 'knn' for a complete description of parameters to and details
     of the k-nearest neighbor procedure in the 'class' package.

_V_a_l_u_e:

     An object of class 'classifOutput'; see 'classifOutput-class'
     for details.

_A_u_t_h_o_r(_s):

     Jess Mar, VJ Carey <stvjc@channing.harvard.edu>

_S_e_e _A_l_s_o:

     'ldaB'

_E_x_a_m_p_l_e_s:

     # access and trim an exprSet
     library(golubEsets)
     data(golubMerge)
     smallG <- golubMerge[1:60,]
     # set a PRNG seed for reproducibility
     set.seed(1234) # needed for nnet initialization
     # now run the classifiers
     knnB( smallG, "ALL.AML", 1:40 )
     nnetB( smallG, "ALL.AML", 1:40, size=5, decay=.01 )
     lvq1B( smallG, "ALL.AML", 1:40 )
     naiveBayesB( smallG, "ALL.AML", 1:40 )
     svmB( smallG, "ALL.AML", 1:40 )
     baggingB( smallG, "ALL.AML", 1:40 )
     ipredknnB( smallG, "ALL.AML", 1:40 )
     sldaB( smallG, "ALL.AML", 1:40 )
     ldaB( smallG, "ALL.AML", 1:40 )
     qdaB( smallG[1:10,], "ALL.AML", 1:40 )
     pamrB( smallG, "ALL.AML", 1:40 )
     rpartB( smallG, "ALL.AML", 1:35 )
     randomForestB( smallG, "ALL.AML", 1:35 )
     gbmB( smallG, "ALL.AML", 1:40, n.minobsinnode=3 , n.trees=6000)
     if (require(LogitBoost)) logitboostB( smallG, "ALL.AML", 1:40, 200 ) # summarize won't work with polych
     stat.diag.daB( smallG, "ALL.AML", 1:40 )

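     The common return value can also be captured and inspected; a
     minimal sketch (continuing with 'smallG' from above, and relying
     only on the 'show' method to summarize the result):

     ```r
     ## Sketch: capture the classifOutput returned by a [f]B wrapper.
     ## Assumes smallG was prepared as in the examples above.
     out <- knnB(smallG, "ALL.AML", 1:40)
     out  # the show method summarizes the test-set predictions
     ```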
