Leveraging Eigenvalues and Eigenvectors for Principal Component Analysis: A Deep Dive Into Dimensionality Reduction Techniques

Krishana
Dr. Vinod Kumar

Abstract

This study evaluates whether Principal Component Analysis (PCA) has a significant effect on machine learning model performance. PCA reduces high-dimensional data to a small set of features that retain most of the critical variance, improving both computational efficiency and predictive accuracy. Across the models examined, namely Logistic Regression, Decision Trees, and Support Vector Machines, training on PCA-transformed data outperforms the corresponding baselines on high-dimensional datasets, yielding higher accuracy, precision, recall, and F1-score. By reducing noise and redundancy, PCA helps the models focus on significant features, thereby enhancing their generalization. The results affirm the importance of PCA in optimizing machine learning workflows, especially when handling large and complex datasets, and in improving model interpretability and computational efficiency.
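The abstract's pipeline, deriving principal components from the eigenvalues and eigenvectors of the data's covariance matrix and projecting onto the leading components, can be sketched as follows. This is an illustrative example only, not the authors' code; the function name `pca_eig` and the synthetic data are assumptions for demonstration.

```python
import numpy as np

def pca_eig(X, k):
    """Project X onto its top-k principal components via eigendecomposition."""
    Xc = X - X.mean(axis=0)                  # center each feature
    cov = np.cov(Xc, rowvar=False)           # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: for symmetric matrices
    order = np.argsort(eigvals)[::-1]        # sort by descending eigenvalue
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = eigvals[:k] / eigvals.sum()  # variance ratio per kept component
    return Xc @ eigvecs[:, :k], explained

# Hypothetical usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
Z, ratio = pca_eig(X, 2)
print(Z.shape)  # reduced design matrix, ready for a downstream classifier
```

The reduced matrix `Z` would then be passed to a classifier such as Logistic Regression in place of the original high-dimensional features; the `explained` ratios indicate how much variance each retained component preserves.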


Article Details

How to Cite
Krishana, & Dr. Vinod Kumar. (2024). Leveraging Eigenvalues and Eigenvectors for Principal Component Analysis: A Deep Dive Into Dimensionality Reduction Techniques. Educational Administration: Theory and Practice, 30(6), 5008–5012. https://doi.org/10.53555/kuey.v30i6.9097
Section
Articles
Author Biographies

Krishana

Research Scholar, Department of Mathematics, Om Sterling Global University, Hisar, Haryana

Dr. Vinod Kumar

Professor, Department of Mathematics, Om Sterling Global University, Hisar, Haryana