knnB              package:MLInterfaces              R Documentation

_A_n _i_n_t_e_r_f_a_c_e _t_o _v_a_r_i_o_u_s _m_a_c_h_i_n_e _l_e_a_r_n_i_n_g _m_e_t_h_o_d_s _f_o_r _E_x_p_r_e_s_s_i_o_n_S_e_t_s

_D_e_s_c_r_i_p_t_i_o_n:

     This document describes a family of wrappers of calls to machine
     learning classifiers distributed through various R packages. This
     particular document concerns the classifiers for which
     training-vs-test set application makes sense.

     For example, 'knnB' is a wrapper for a call to 'knn' for objects
     of class 'ExpressionSet'. These interfaces, of the form '[f]B',
     provide a common calling sequence and a common return value for
     the machine learning code in function '[f]'.

     For details on the additional arguments that may be passed to any
     covered machine learning function 'f', check the manual page for
     that function. This will require loading the package in which 'f'
     is found.

_U_s_a_g_e:

     knnB(exprObj, classifLab, trainInd, k = 1, l = 1, prob = TRUE,
       use.all = TRUE, metric = "euclidean") 
     #
     # for such functions as nnetB, use the same first three
     # parameters, and then add optional parameters from the nnet API
     #

_A_r_g_u_m_e_n_t_s:

 exprObj: An instance of the 'ExpressionSet' class. 

classifLab: The name of the phenotype variable to use for
          classification. 

trainInd: integer vector: Which elements are the training set. 

       k: The number of nearest neighbors. 

       l: See 'knn' for a complete description. 

    prob: See 'knn' for a complete description. 

 use.all: See 'knn' for a complete description. 

  metric: See 'knn' for a complete description. 

_D_e_t_a_i_l_s:

     Note: As of version 1.13.18 of MLInterfaces, 'randomForestB' is no
     longer supported; please use 'MLearn' with the 'randomForestI'
     learnerSpec instead.

     See 'knn' for a complete description of parameters to and details
     of the k-nearest neighbor procedure in the 'class' package.

     For other interfaces, such as ldaB, nnetB, rpartB, gbmB, and so
     on, see the usage note above and also see the man pages for those
     functions.  For each of these functions you will need to attach
     the appropriate package in order to examine the man page.

     The 'MLearn' interface is a more unified approach but is still
     maturing.
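
     As a hedged sketch only (the 'knnI' learnerSpec constructor and
     the 'confuMat' accessor are assumed from the 'MLearn' interface in
     current MLInterfaces), the first 'knnB' call in the Examples below
     would correspond to:

         library(MLInterfaces)
         library(golubEsets)
         data(Golub_Merge)
         smallG <- Golub_Merge[1:60,]
         # the formula names the phenotype variable to classify;
         # 1:40 gives the indices of the training set
         k1 <- MLearn(ALL.AML ~ ., smallG, knnI(k = 1, l = 0), 1:40)
         confuMat(k1)   # cross-tabulate predicted vs. true classes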

_V_a_l_u_e:

     An object of class '"classifOutput"'. This class unifies the
     representation of results of machine learning algorithms
     implemented by different designers.
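
     For instance, assuming 'smallG' as constructed in the Examples
     below, the unified return value can be inspected with base R
     tools (a sketch, not part of the interface itself):

         out <- knnB( smallG, "ALL.AML", 1:40 )
         out        # the show method summarizes the classification
         str(out)   # list the slots of the 'classifOutput' object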

_A_u_t_h_o_r(_s):

     Jess Mar, VJ Carey <stvjc@channing.harvard.edu>

_S_e_e _A_l_s_o:

     'xval' for information on how to obtain various cross-validated
     fits, and 'MLearn' for a less fragmented implementation of the
     interface.

_E_x_a_m_p_l_e_s:

     # access and trim an ExpressionSet
     library(golubEsets)
     data(Golub_Merge)
     smallG <- Golub_Merge[1:60,]
     # set a PRNG seed for reproducibility
     set.seed(1234) # needed for nnet initialization
     # now run the classifiers
     knnB( smallG, "ALL.AML", 1:40 )
     nnetB( smallG, "ALL.AML", 1:40, size=5, decay=.01 )
     lvq1B( smallG, "ALL.AML", 1:40 )
     naiveBayesB( smallG, "ALL.AML", 1:40 )
     svmB( smallG, "ALL.AML", 1:40 )
     baggingB( smallG, "ALL.AML", 1:40 )
     ipredknnB( smallG, "ALL.AML", 1:40 )
     sldaB( smallG, "ALL.AML", 1:40 )
     ldaB( smallG, "ALL.AML", 1:40 )
     qdaB( smallG[1:10,], "ALL.AML", 1:40 )
     pamrB( smallG, "ALL.AML", 1:40 )
     rpartB( smallG, "ALL.AML", 1:35 )
     gbmB( smallG, "ALL.AML", 1:40, n.minobsinnode=3 , n.trees=6000)
     stat.diag.daB( smallG, "ALL.AML", 1:40 )

