
Granular-ball Computing: Principles & Applications

Updated 27 December 2025
  • Granular-ball Computing (GBC) is a computational paradigm that represents data as adaptive hyperspherical regions (granular balls) to enhance efficiency and robustness.
  • It optimizes data coverage and quality via justifiable granularity, balancing a minimal number of balls with high purity and specificity.
  • GBC has been effectively applied in clustering, classification, deep learning robustness, and graph coarsening, yielding significant computational savings and improved performance.

Granular-ball Computing (GBC) is a computational paradigm that operationalizes the principle of multi-granularity by representing data as adaptive, coverage-maximizing, hyperspherical regions—granular balls (GBs)—rather than points. In GBC, learning and inference occur on these information granules, enabling substantial reductions in computational cost and noise sensitivity, while preserving or enhancing representation fidelity and interpretability across tasks such as clustering, classification, feature selection, deep learning robustness, rough set modeling, and graph coarsening (Xia et al., 2023, Jia et al., 16 May 2025, Xia et al., 2022, Xia et al., 2022, Xie et al., 2023, Xia et al., 30 Jan 2025).

1. Foundational Concepts and Formal Definitions

GBC is grounded in the "Global-first" cognitive mechanism for information processing, whereby data is initially represented at coarse granularity and adaptively refined only where higher resolution is justified (Xia et al., 2023). A granular ball in $m$-dimensional space is defined by a center $c_X = (1/|X|)\sum_{x \in X} x$ and (typically) an average radius $R_X^{ave} = (1/|X|)\sum_{x \in X} \|x - c_X\|_2$ or a maximal radius $R_X^{max} = \max_{x \in X} \|x - c_X\|_2$. The set of balls $\Phi = \{\Omega_{X_1},\dots,\Omega_{X_t}\}$ forms a covering of the data universe $U = \{x_1,\dots,x_n\} \subset \mathbb{R}^m$, with $|\Phi| \ll n$. Each GB may be further annotated by properties such as purity (proportion of the majority label) or other task-specific quality metrics (Xia et al., 2022, Xia et al., 2022).
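The descriptors above (center, average radius, maximal radius, purity) can be computed in a few lines. The following is a minimal sketch; the function name `ball_stats` and its interface are illustrative, not taken from any cited implementation:

```python
import numpy as np

def ball_stats(X, labels=None):
    """Compute the descriptors of one granular ball over a sample set X (n x m).

    Returns the center c_X, average radius R_ave, maximal radius R_max,
    and (if labels are given) purity = proportion of the majority label.
    """
    c = X.mean(axis=0)                       # c_X = (1/|X|) * sum of samples
    d = np.linalg.norm(X - c, axis=1)        # distances ||x - c_X||_2
    r_ave = d.mean()                         # average radius R_X^ave
    r_max = d.max()                          # maximal radius R_X^max
    purity = None
    if labels is not None:
        _, counts = np.unique(labels, return_counts=True)
        purity = counts.max() / len(labels)  # majority-label proportion
    return c, r_ave, r_max, purity
```

For example, four points at the corners of a unit-2 square give a center at the middle, equal average and maximal radii, and a purity of 0.75 when one of four labels disagrees.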

The generation of granular balls aims to optimize three antagonistic objectives: (1) maximize data coverage; (2) minimize the number of balls (promoting coarseness); (3) guarantee quality (often via a purity threshold or a justifiable granularity function) (Xia et al., 2023, Jia et al., 16 May 2025).

2. Principles of Granular-Ball Generation and Justifiable Granularity

High-quality GB construction is determined by the Principle of Justifiable Granularity (POJG), which dictates that a granule must simultaneously maximize coverage and specificity. The quality function is $Q(\Omega_X) = Q_C(\Omega_X) \cdot Q_S(\Omega_X)$, where coverage $Q_C$ is a non-decreasing function (commonly cardinality) and specificity $Q_S$ is a non-increasing function (commonly $e^{-\gamma R_X^{ave}}$ with a granularity parameter $\gamma$) (Jia et al., 16 May 2025). GB-POJG+ introduces a penalized objective to prevent over-granulation:

$$J^*(\Phi, \lambda) = \sum_{i=1}^{t} Q(\Omega_{X_i}) - \lambda t,$$

where $\lambda \geq 0$ penalizes the number of GBs, balancing descriptive power and computational cost. The splitting of balls is controlled by penalized quality and abnormal-ball detection based on radius and sample statistics, ensuring that boundary, noisy, or outlier balls are identified and recursively refined (Jia et al., 16 May 2025, Xie et al., 2023).
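A hedged sketch of this objective, assuming cardinality for coverage and the exponential specificity given above (the function names are my own, not from the cited work):

```python
import numpy as np

def ball_quality(X, gamma):
    """Q(ball) = coverage * specificity, with coverage = |X| (cardinality)
    and specificity = exp(-gamma * R_ave), the common POJG choices."""
    c = X.mean(axis=0)
    r_ave = np.linalg.norm(X - c, axis=1).mean()
    return len(X) * np.exp(-gamma * r_ave)

def penalized_objective(balls, gamma, lam):
    """J*(Phi, lambda) = sum_i Q(ball_i) - lambda * t, where t = number
    of balls; larger lambda favors coarser partitions."""
    return sum(ball_quality(X, gamma) for X in balls) - lam * len(balls)
```

A degenerate single-point ball has $R^{ave} = 0$ and thus quality 1 regardless of $\gamma$; the $\lambda t$ term is what prevents the optimizer from collapsing to many such tiny balls.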

Adaptive selection of the granularity parameter $\gamma$ is realized by solving a set of split-gain inequalities across candidate nodes in the GB tree. This process searches for the coarsest admissible granularity, allowing further refinement where hyperspherical models break down, which is essential for complex manifold or non-uniformly distributed data (Jia et al., 16 May 2025).

3. Algorithmic Frameworks and Computational Properties

Algorithmically, GBC methods proceed through the following general stages:

  • Initialization: Begin with a single ball or an initial coarse partition (e.g., via $k$-means or farthest-first heuristics).
  • Recursive Splitting: Balls are split if they violate a quality threshold (purity, compactness, or POJG-based quality). Splitting can leverage deterministic farthest-point, k-division, or attention-based approaches for efficiency and stability (Xia et al., 2022, Xie et al., 2023).
  • Overlap and Outlier Handling: Overlap between heterogeneous balls is eliminated to suppress boundary ambiguity, and single-sample or abnormally large balls are recognized as outliers and either pruned or split (Jia et al., 16 May 2025, Xie et al., 2023).
  • Termination and Refinement: Splitting halts when all balls satisfy quality constraints or a minimal allowable size. In clustering contexts, further refinement via adjacency, k-NN graphs, or spanning tree constructions may be performed (Xia et al., 2022, Xie et al., 2023).

The computational complexity of granular-ball generation is typically near-linear, $O(n)$ or $O(n \log n)$, given that $|\Phi| \ll n$ and splitting halts at sub-polynomial sizes in $n$ (Xia et al., 2022, Xia et al., 2023, Jia et al., 16 May 2025). Downstream tasks (classification, clustering, graph operations) operate on $|\Phi|$ objects, dramatically reducing cost relative to point-wise methods.
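The four stages above can be sketched as a purity-driven recursive splitter. This is an illustrative simplification (deterministic farthest-point 2-division with a purity stopping rule), not a faithful reimplementation of any cited algorithm:

```python
import numpy as np

def split_ball(X, y):
    """Deterministic farthest-point 2-division: seed on the sample farthest
    from the center and the sample farthest from that seed, then assign
    each point to its nearest seed."""
    c = X.mean(axis=0)
    s1 = np.argmax(np.linalg.norm(X - c, axis=1))
    s2 = np.argmax(np.linalg.norm(X - X[s1], axis=1))
    near1 = (np.linalg.norm(X - X[s1], axis=1)
             <= np.linalg.norm(X - X[s2], axis=1))
    return (X[near1], y[near1]), (X[~near1], y[~near1])

def generate_balls(X, y, purity_thresh=0.95, min_size=2):
    """Recursively split any ball whose purity violates the threshold,
    halting at the minimal allowable size (termination stage)."""
    purity = np.max(np.bincount(y)) / len(y)
    if purity >= purity_thresh or len(y) <= min_size:
        return [(X, y)]
    (Xa, ya), (Xb, yb) = split_ball(X, y)
    if len(ya) == 0 or len(yb) == 0:   # degenerate split: stop refining
        return [(X, y)]
    return (generate_balls(Xa, ya, purity_thresh, min_size)
            + generate_balls(Xb, yb, purity_thresh, min_size))
```

On two well-separated clusters with different labels, the initial impure ball is split once and both children satisfy the purity constraint, so exactly two balls are produced.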

4. Applications: Clustering, Robust Learning, Feature Selection, and More

GBC has been systematically deployed in diverse contexts:

  • Clustering: GBC-based algorithms, including GBC (Xia et al., 2022), GBCT (Xia et al., 2024), LGBQPC (Jia et al., 16 May 2025), GBSK (Chen et al., 28 Sep 2025), and GBMST (Xie et al., 2023), partition data by merging or graph analysis on GBs, efficiently recovering arbitrary shapes and manifolds, robust to noise and density heterogeneity.
  • Classification: GB-based k-NN (Xie et al., 2023) and SVM methods (Xia et al., 2022, Zhao et al., 2024) operate on ball centers and radii, aggregating label information, thereby achieving resilience to outliers and reducing input size; purity-motivated splitting and harmonic distance corrections further enhance accuracy and efficiency.
  • Fuzzy and Rough Set Theory: GBFRS (Xia et al., 30 Jan 2025) and granular-ball rough set frameworks (Xia et al., 2022) generalize Pawlak and neighborhood rough sets to multi-granularity, improving interpretability and robustness. Weighted dependence and adaptive neighborhood boundaries facilitate robust feature selection.
  • Deep Learning Robustness: In deep convolutional networks, GBC modules cluster feature representations, discarding or down-weighting suspected noisy samples; gradients are propagated via centroid-based aggregation, resulting in significant improvements under label noise (Dai et al., 2022, Dai et al., 2024).
  • Graph Processing: GBGC (Xia et al., 24 Jun 2025) coarsens graphs by adaptively generating GBs as supernodes, yielding substantial computational savings while preserving spectral structure.
  • Feature Selection and Knowledge Transfer: GBC is used as a representation base for continual feature selection and knowledge transfer, detecting open-set classes, and enabling efficient incremental feature subset optimization (Cao et al., 2024).
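As an illustration of the classification bullet above, a minimal nearest-granular-ball predictor might look as follows. The radius-corrected center distance is one common GB distance convention, and all names here are assumptions, not the cited methods:

```python
import numpy as np

def gb_knn_predict(query, balls):
    """Nearest-granular-ball classification sketch. Each ball is a tuple
    (center, radius, majority_label); the query takes the label of the
    ball minimizing center distance minus radius, so larger balls
    attract nearby points."""
    dists = [np.linalg.norm(query - c) - r for c, r, _ in balls]
    return balls[int(np.argmin(dists))][2]
```

Because only $|\Phi|$ ball comparisons are needed per query instead of $n$ point comparisons, prediction cost drops accordingly, and a few mislabeled points inside a ball cannot change its majority label.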

5. Empirical Performance and Comparative Evaluation

Extensive benchmarks confirm the validity of GBC approaches. In clustering, methods such as LGBQPC (Jia et al., 16 May 2025) outperform density-peak, DBSCAN, spectral, and prior GB-based algorithms over 40 heterogeneous datasets, excelling in NMI and ARI. Experiments on datasets up to $n = 60{,}000$ validate both scalability and top-ranked accuracy. GBC clustering is robust to its key parameters: the penalty $\lambda$ controls granularity, and the neighborhood size $k$ impacts graph construction; both show stable or flat regions in performance curves over wide ranges (Jia et al., 16 May 2025).

In classification, GB-based algorithms routinely match or surpass classical kNN, SVM, and fuzzy SVM on UCI and real-world data, especially under label noise. Efficiency gains are up to $100\times$ in training and prediction time, and robustness to noise is demonstrated across noise rates up to 50% (Xie et al., 2023, Xia et al., 2022, Zhao et al., 2024). In deep learning, GBC layers integrated with CNNs under random and human noise consistently reduce effective noise in training batches and drive absolute accuracy gains of 2–5% in challenging regimes (Dai et al., 2022, Dai et al., 2024).

In graph coarsening, GBGC achieves $10$–$100\times$ speedups and equal or improved graph classification accuracy compared to competing spectral and kernel-based coarsening methods (Xia et al., 24 Jun 2025). For open-world continual feature selection, GBC mechanisms enable $10\times$ speedups and maintenance or improvement of F1-scores and core metrics as new classes and features arrive (Cao et al., 2024).

6. Interpretability, Robustness, and Limitations

GBC's multi-granular representations produce intermediate models (ball hierarchies, cover trees, adjacency graphs) that are transparent and interpretable. Splitting and merging operations mirror human-perceived shapes and boundaries. Theoretical margin amplification occurs in SVM-style classifiers due to the explicit account of ball radii, conferring provable insensitivity to perturbations not exceeding the granule scale (Xia et al., 2023, Xia et al., 2022).

Robustness emerges from three mechanisms: (1) majority voting inside GBs suppresses point-wise label flips; (2) adaptive refinement ensures finer balls at class boundaries; (3) outlier and abnormal-ball detection partitions or isolates noise and contamination (Jia et al., 16 May 2025, Xie et al., 2023). When all balls degenerate to singletons, GBC models recover point-based limits, guaranteeing theoretical consistency.
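Mechanism (1) can be sketched in a few lines; the function name and interface are illustrative:

```python
import numpy as np

def denoise_by_majority(balls_labels):
    """Replace every label inside each ball with that ball's majority
    label, suppressing isolated point-wise label flips."""
    cleaned = []
    for y in balls_labels:
        values, counts = np.unique(y, return_counts=True)
        cleaned.append(np.full_like(y, values[np.argmax(counts)]))
    return cleaned
```

A single flipped label inside a ball of four samples is overwritten by the other three, which is exactly why ball-level voting tolerates noise that point-wise methods do not.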

Limitations include the need to select or adapt granularity thresholds, potential collapse in high-dimensional low-sample regimes, and intricacies in the direct extension to non-Euclidean or non-vectorial data. Approaches for fully automatic granularity adaptation, attribute-weighted ball construction, and integration with deep, end-to-end learning frameworks are current research frontiers (Xia et al., 2023, Xia et al., 2024).

7. Outlook and Continued Development

The GBC framework is actively under expansion across AI subfields. Ongoing work encompasses meta-learning for automated parameter selection, kernelization for manifold and graph data, streaming and distributed implementations, and further embedding in neural network architectures for interpretable representation learning (Xia et al., 2023). As its empirical base grows, GBC is poised to become a standard paradigm in efficient, robust, and interpretable computation for large-scale and complex-structured data (Jia et al., 16 May 2025, Xia et al., 2023).

