ECB-ART-54988
Sci Rep
2026 May 05; doi: 10.1038/s41598-026-51422-0.
Transformer based breast cancer classification from histopathological images using sequential residual recurrent multiscale attention network.
Abstract
Breast cancer is a leading cause of mortality among women globally, highlighting the need for accurate and robust diagnostic systems. This study presents the Breast Cancer Classification Transformer Network (BrCTransNet), a novel deep learning framework for breast cancer classification from histopathological images. Input images are first preprocessed with an Adaptive Gaussian Bilateral (AGB) filter to reduce noise while preserving essential features. Feature extraction is then performed by a Sequential Residual Recurrent Multiscale Attention Network (S2RMANet), which captures rich multi-scale spatial and long-range contextual features. The extracted features are classified by the Optimized Polarized Self-Attention-based Transformer (OPATransNet), a Transformer model that integrates a Polarized Self-Attention module to learn spatial and channel-wise dependencies independently. To optimize the loss function, the Chebyshev StarFish (ChStF) optimization algorithm tunes the OPATransNet hyperparameters, combining Chaotic Chebyshev Mapping with the StarFish Optimization Algorithm (SFOA) to enhance performance. Experimental results on the Breast Histopathology Images dataset demonstrate that the proposed BrCTransNet achieves a classification accuracy of 98.97%, recall of 98.91%, F1-score of 98.98%, specificity of 98.36%, and an MSE of 0.0103, outperforming existing models. These findings confirm the superiority of the proposed model in terms of accuracy, robustness, and convergence speed, establishing its potential as a reliable tool for breast cancer diagnosis.
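The preprocessing stage described above rests on bilateral filtering, which smooths noise while keeping edges by weighting a Gaussian spatial kernel with a Gaussian range (intensity) kernel. The abstract does not specify how the AGB filter adapts its parameters, so the sketch below is a plain single-channel bilateral filter for illustration only; the function name and default parameter values are assumptions, not the paper's implementation.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Naive bilateral filter on a 2-D grayscale image (values in [0, 1]).

    Each output pixel is a weighted average of its neighborhood, where
    the weight is the product of a Gaussian on spatial distance (sigma_s)
    and a Gaussian on intensity difference (sigma_r). Pixels across a
    strong edge get low range weights, so edges are preserved.
    """
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    # Fixed spatial kernel over the (2*radius+1)^2 window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2.0 * sigma_s ** 2))
    padded = np.pad(img.astype(float), radius, mode="reflect")
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: penalize intensity differences from the center.
            range_k = np.exp(-((patch - img[i, j]) ** 2) / (2.0 * sigma_r ** 2))
            weights = spatial * range_k
            out[i, j] = np.sum(weights * patch) / np.sum(weights)
    return out
```

In practice a histopathology pipeline would use an optimized implementation (e.g. OpenCV's `cv2.bilateralFilter`) per color channel; the loop above only makes the weighting explicit.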
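The Chaotic Chebyshev Mapping mentioned above follows the standard recurrence x_{k+1} = cos(n * arccos(x_k)) on [-1, 1], which is commonly used to seed metaheuristic populations with well-spread, non-repeating values. A minimal sketch of that use is shown below; the function names, the choice n = 4, and the use for population initialization are illustrative assumptions, since the abstract does not describe how the map is coupled to the StarFish Optimization Algorithm.

```python
import math

def chebyshev_map(x, n=4):
    """One step of the Chebyshev chaotic map: x_{k+1} = cos(n * arccos(x_k))."""
    return math.cos(n * math.acos(x))

def chaotic_init(pop_size, dim, lower, upper, x0=0.7, n=4):
    """Initialize a candidate population with a Chebyshev chaotic sequence.

    The chaotic values live in [-1, 1]; each is rescaled linearly to the
    search bounds [lower, upper]. Chaotic initialization is a common way
    to improve population diversity over uniform random sampling.
    """
    population = []
    x = x0  # seed must lie in [-1, 1] for acos to be defined
    for _ in range(pop_size):
        candidate = []
        for _ in range(dim):
            x = chebyshev_map(x, n)
            candidate.append(lower + (x + 1.0) / 2.0 * (upper - lower))
        population.append(candidate)
    return population
```

From such an initial population, the SFOA update rules would then iterate toward hyperparameter settings (e.g. learning rate, attention dimensions) that minimize the OPATransNet loss.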
PubMed: 42086722