From: Comparative analysis of regression algorithms for drug response prediction using GDSC dataset
Library/Algorithm | Abbreviation | Description | Category |
---|---|---|---|
sklearn/KNeighborsRegressor | KNN | Predicts the output as the average of the property values of the k nearest neighbors [8] | Miscellaneous |
sklearn/RandomForestRegressor | RFR | Extends the concept of regression trees by exploiting computing power to simultaneously generate hundreds of regression trees [9] | Ensemble |
sklearn/SVR | SVR | Characterized by the use of kernels, a sparse solution, and VC control of the margin and the number of support vectors [10] | Kernel-based |
sklearn/DecisionTreeRegressor | DTR | Generates a decision tree from the given instances [11] | Tree- or rule-based |
sklearn/AdaBoostRegressor | ADA | Combines several decision tree regressors as weak learners [12] | Ensemble |
sklearn/GradientBoostingRegressor | GBR | Integrated model that combines weak learners sequentially for higher performance and better stability [13] | Ensemble |
lightgbm/LGBMRegressor | LGBM | Framework for implementing gradient boosting decision trees [14] | Ensemble |
xgboost/XGBRegressor | XGBR | Scalable machine learning system for tree boosting [15] | Ensemble |
sklearn/MLPRegressor | MLP | Feed-forward neural network for handling non-linear regression models [16] | Artificial neural network |
sklearn/GaussianProcessRegressor | GPR | Nonparametric method that belongs to the Bayesian statistics family [17] | Miscellaneous |
sklearn/Ridge | RGE | Designed to find the linear hyperplane that approximates the data labels well [18] | Regularized |
sklearn/Lasso | LAS | Based on the concept of minimizing the standard mean squared error penalized by the sum of the absolute values of the regression coefficients [19] | Regularized |
sklearn/ElasticNet | EN | Form of regularized optimization for linear regression [20] | Regularized |
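
As a rough illustration of how a comparison across these libraries can be set up, the sketch below instantiates the listed regressors with their default hyperparameters and scores them by cross-validated RMSE. It is not the paper's exact pipeline: the synthetic data, the RMSE metric, and the 5-fold cross-validation settings are placeholder assumptions standing in for the GDSC features, drug-response labels, and evaluation protocol.

```python
# Minimal sketch (not the authors' exact pipeline): instantiate the regressors
# from the table and compare them by cross-validated RMSE.
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import (RandomForestRegressor, AdaBoostRegressor,
                              GradientBoostingRegressor)
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import Ridge, Lasso, ElasticNet
from lightgbm import LGBMRegressor
from xgboost import XGBRegressor

# Synthetic data stands in for the GDSC feature matrix X and response vector y.
X, y = make_regression(n_samples=200, n_features=50, noise=0.1, random_state=0)

models = {
    "KNN": KNeighborsRegressor(),
    "RFR": RandomForestRegressor(random_state=0),
    "SVR": SVR(),
    "DTR": DecisionTreeRegressor(random_state=0),
    "ADA": AdaBoostRegressor(random_state=0),
    "GBR": GradientBoostingRegressor(random_state=0),
    "LGBM": LGBMRegressor(random_state=0),
    "XGBR": XGBRegressor(random_state=0),
    "MLP": MLPRegressor(max_iter=2000, random_state=0),
    "GPR": GaussianProcessRegressor(random_state=0),
    "RGE": Ridge(),
    "LAS": Lasso(),
    "EN": ElasticNet(),
}

for name, model in models.items():
    # 5-fold CV; scikit-learn returns negated RMSE, so flip the sign.
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_root_mean_squared_error")
    print(f"{name}: RMSE = {-scores.mean():.3f} +/- {scores.std():.3f}")
```

In practice, each model would be tuned (e.g., via grid search) rather than run with defaults, and evaluated on the actual GDSC feature matrix and drug-response values.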