Leveraging Eigenvalues and Eigenvectors for Principal Component Analysis: A Deep Dive Into Dimensionality Reduction Techniques
Abstract
This study evaluates whether Principal Component Analysis (PCA) has a significant effect on the performance of machine learning models. PCA reduces large, high-dimensional datasets to a smaller set of significant features that retain the critical variance, improving efficiency and predictive accuracy. Models trained with PCA, namely Logistic Regression, Decision Trees, and Support Vector Machines, outperform their counterparts trained without it, yielding higher accuracy, precision, recall, and F1-score than the baselines on high-dimensional data. By reducing noise and redundancy, PCA helps the models focus on significant features, enhancing their generalization capabilities. The results affirm the importance of PCA in optimizing machine learning workflows, especially for large and complex datasets, and in improving model interpretability and computational efficiency.
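The eigenvalue-based reduction described above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: it assumes plain NumPy, a synthetic dataset, and a hypothetical helper name `pca_eig`. It computes the covariance matrix of the centered data, takes its eigendecomposition, and projects onto the top-k eigenvectors (the principal components), where the eigenvalues give the variance explained by each component.

```python
import numpy as np

def pca_eig(X, k):
    """Project X onto its top-k principal components via eigendecomposition."""
    # Center the data so the covariance matrix reflects variance around the mean
    Xc = X - X.mean(axis=0)
    # Covariance matrix of the features (d x d)
    cov = np.cov(Xc, rowvar=False)
    # eigh is appropriate here because the covariance matrix is symmetric
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Sort components by descending eigenvalue, i.e. by variance explained
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    # Projected data and the fraction of total variance each kept component explains
    return Xc @ eigvecs[:, :k], eigvals[:k] / eigvals.sum()

# Synthetic example: 200 samples, 10 features, reduced to 3 components
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
Z, explained_ratio = pca_eig(X, 3)
print(Z.shape)  # (200, 3)
```

The reduced matrix `Z` would then be fed to the downstream classifiers in place of `X`; the explained-variance ratios guide the choice of k.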