OPTIMIZATION ALGORITHMS FOR ENHANCING HIGH DIMENSIONAL BIOMEDICAL DATA PROCESSING EFFICIENCY

Authors

  • Rifat Chowdhury, Master of Business Administration, University of North Alabama, Florence, AL, USA
  • Jinnat Ara, Master of Science in Applied Mathematics, Noakhali Science and Technology University, Bangladesh

DOI:

https://doi.org/10.63125/2zg6x055

Keywords:

Optimization Algorithms, High-Dimensional Biomedical Data, Computational Efficiency, Scalability, Stability

Abstract

High-dimensional biomedical data processing places substantial computational demands on optimization algorithms due to extreme feature dimensionality, sparsity, missingness, and heterogeneous task structures. This study quantitatively evaluated how optimization algorithm families influenced processing efficiency under fixed task-quality constraints across representative biomedical workflows, including predictive modeling, feature selection, and reconstruction tasks. A controlled benchmarking design was applied to 12 high-dimensional datasets, producing 1,680 algorithm executions across 14 algorithm variants and 7 algorithm families, with 10 repeated runs per condition. Processing efficiency was operationalized using multiple indicators, including wall-clock time-to-target, peak memory usage, iterations or epochs to convergence, throughput, and numerical stability outcomes. Descriptive results showed that time-to-target runtime ranged from 2.8 s to 1,420.6 s, with a median of 96.4 s, while peak memory usage ranged from 0.9 GB to 21.6 GB, with a median of 6.3 GB. Constraint failure occurred in 6.8% of runs, and numerical error events were observed in 2.1% of executions. Mixed-effects regression analyses demonstrated statistically significant differences across optimization algorithm families for runtime, memory usage, and convergence behavior after controlling for feature dimensionality, sparsity ratio, missingness rate, and task type. Scaling analysis indicated that increasing feature dimensionality from 10,000 to 250,000 features increased median runtime from 42.3 s to 188.9 s and median peak memory from 3.1 GB to 9.7 GB under constant performance constraints. Reliability analysis supported the internal consistency of the composite efficiency framework, with an overall Cronbach’s alpha of 0.89. 
Overall, the findings demonstrated that optimization algorithm choice produced statistically measurable differences in efficiency, stability, and scalability in high-dimensional biomedical data processing, highlighting the importance of structured, constraint-based benchmarking for robust comparative evaluation.
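The abstract operationalizes efficiency via wall-clock time-to-target, peak memory, and iterations to convergence under a fixed task-quality constraint. A minimal sketch of how such a per-run measurement might look, using only the Python standard library (the optimizer, loss, and parameter names below are illustrative placeholders, not the paper's actual benchmarking harness):

```python
import time
import tracemalloc

def benchmark_time_to_target(optimizer_step, loss_fn, params,
                             target_loss, max_iters=10_000):
    """Run an iterative optimizer until loss_fn(params) <= target_loss.

    Returns wall-clock time-to-target (s), peak traced memory (bytes),
    iterations used, and whether the quality constraint was met.
    """
    tracemalloc.start()
    start = time.perf_counter()
    converged = False
    iters = 0
    for iters in range(1, max_iters + 1):
        params = optimizer_step(params)          # one optimization step
        if loss_fn(params) <= target_loss:       # fixed task-quality constraint
            converged = True
            break
    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return {"time_s": elapsed, "peak_bytes": peak,
            "iters": iters, "converged": converged}

# Toy example: gradient descent on f(x) = x^2 with learning rate 0.1.
step = lambda x: x - 0.1 * (2 * x)
loss = lambda x: x * x
result = benchmark_time_to_target(step, loss, params=1.0, target_loss=1e-6)
print(result["converged"], result["iters"])
```

In the study's design, a run that exhausts `max_iters` without reaching the target would count toward the reported constraint-failure rate (6.8% of runs), while repeated executions of the same condition would supply the stability indicators.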

Published

2022-12-25

How to Cite

Rifat Chowdhury, & Jinnat Ara. (2022). OPTIMIZATION ALGORITHMS FOR ENHANCING HIGH DIMENSIONAL BIOMEDICAL DATA PROCESSING EFFICIENCY. Review of Applied Science and Technology, 1(04), 98–145. https://doi.org/10.63125/2zg6x055
