Operational data is typically very large, so a preprocessing step is needed to prepare it for analysis and keep the analytical process fast. One approach is to select the most effective features and remove the rest. Feature selection algorithms (FSAs) can do this with varying accuracy, depending on both the nature of the data and the algorithm itself, which motivates researchers to keep developing new FSAs that achieve higher accuracy than existing ones. Moreover, FSAs are essential for reducing the cost and effort of developing information system applications. Combining multiple methodologies may improve the dimensionality-reduction rate while retaining reasonable accuracy. This research proposes a hybrid feature selection algorithm (ScCFS) based on the ScC and forward selection methods. ScC is based on stability and correlation, while the forward selection is based on Random Forest (RF) and Information Gain (IG). The reduced subset generated by ScC is fed to the forward selection method, which uses IG as the criterion for choosing the attribute on which to split each RF node, producing the optimal reduct. ScCFS was compared with other well-known FSAs in terms of accuracy, AUC, and F-score, using several classification algorithms and several datasets. The results show that ScCFS outperforms the other FSAs for all classifiers in terms of accuracy, except FLM, where it ranks second. This indicates that ScCFS is among the best-performing methods for generating reduced datasets while maintaining high classifier accuracy.
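The two-stage pipeline described above (a correlation-based filter followed by forward selection scored by information gain) can be sketched roughly as follows. This is a simplified illustration, not the paper's implementation: the ScC stability criterion is omitted, IG here scores whole features rather than acting as an RF node-split criterion, and all function names and thresholds are hypothetical.

```python
import numpy as np

def entropy(y):
    """Shannon entropy (bits) of a discrete label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def info_gain(x, y):
    """Information gain of a discrete feature x with respect to labels y."""
    gain = entropy(y)
    for v in np.unique(x):
        mask = x == v
        gain -= mask.mean() * entropy(y[mask])
    return gain

def correlation_filter(X, threshold=0.9):
    """Stage 1 (stand-in for ScC): keep a feature only if its absolute
    Pearson correlation with every already-kept feature is below the threshold."""
    kept = []
    for j in range(X.shape[1]):
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < threshold
               for k in kept):
            kept.append(j)
    return kept

def forward_select(X, y, candidates, k):
    """Stage 2 (simplified): greedily add the candidate feature with the
    highest information gain until k features are selected."""
    selected = []
    remaining = list(candidates)
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda j: info_gain(X[:, j], y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

In a full implementation the second stage would grow RF trees, using IG at each node split; the greedy per-feature scoring above only conveys the shape of the filter-then-wrapper design.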