Enhancing Interval Type-2 Fuzzy Logic Systems: Learning for Precision and Prediction Intervals (2404.12802v1)

Published 19 Apr 2024 in cs.LG and cs.AI

Abstract: In this paper, we tackle the task of generating Prediction Intervals (PIs) in high-risk scenarios by proposing enhancements for learning Interval Type-2 (IT2) Fuzzy Logic Systems (FLSs) that address their learning challenges. In this context, we first provide extra design flexibility to the Karnik-Mendel (KM) and Nie-Tan (NT) center-of-sets calculation methods so that they can generate PIs more flexibly; these enhancements add flexibility to KM in the defuzzification stage and to NT in the fuzzification stage. To address the large-scale learning challenge, we transform the constrained learning problem of the IT2-FLS into an unconstrained form via parameterization tricks, enabling the direct application of deep learning optimizers. To address the curse of dimensionality, we extend the High-Dimensional Takagi-Sugeno-Kang (HTSK) method, originally proposed for type-1 FLSs, to IT2-FLSs, resulting in the HTSK2 approach. Additionally, we introduce a framework to learn the enhanced IT2-FLS with a dual focus on high precision and PI generation. Through exhaustive statistical results, we show that HTSK2 effectively addresses the dimensionality challenge, while the enhanced KM and NT methods improve the learning and uncertainty quantification performance of IT2-FLSs.
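
Two of the learning tricks mentioned in the abstract can be made concrete with a short sketch: (i) reparameterizing the constrained IT2-FLS antecedent parameters so that an off-the-shelf deep learning optimizer can be applied directly, and (ii) an HTSK-style firing strength that averages the per-dimension exponents rather than summing them, which keeps firing levels from collapsing as the input dimension grows. The code below is a minimal illustrative sketch, not the paper's implementation: it assumes Gaussian membership functions with uncertain standard deviation, and the names (IT2AntecedentHTSK, raw_sigma, nie_tan_output, etc.) are hypothetical.

```python
# Minimal, illustrative sketch (not the paper's code): an IT2 fuzzy antecedent
# layer with unconstrained parameters and an HTSK-style firing strength,
# assuming Gaussian membership functions with uncertain standard deviation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class IT2AntecedentHTSK(nn.Module):
    def __init__(self, n_rules: int, n_inputs: int):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(n_rules, n_inputs))
        # Unconstrained raw parameters; softplus maps them to positive values,
        # so sigma_lower > 0 and sigma_upper >= sigma_lower hold by construction
        # and no projection step or constrained optimizer is needed.
        self.raw_sigma = nn.Parameter(torch.zeros(n_rules, n_inputs))
        self.raw_sigma_gap = nn.Parameter(torch.zeros(n_rules, n_inputs))

    def forward(self, x: torch.Tensor):
        # x: (batch, n_inputs)
        sigma_lower = F.softplus(self.raw_sigma) + 1e-6
        sigma_upper = sigma_lower + F.softplus(self.raw_sigma_gap)
        diff = x.unsqueeze(1) - self.centers          # (batch, n_rules, n_inputs)
        # HTSK idea: average the per-dimension exponents instead of summing them,
        # so firing strengths do not vanish as n_inputs grows.
        f_lower = torch.exp(-0.5 * (diff / sigma_lower).pow(2).mean(dim=-1))
        f_upper = torch.exp(-0.5 * (diff / sigma_upper).pow(2).mean(dim=-1))
        return f_lower, f_upper                       # each (batch, n_rules)


def nie_tan_output(f_lower, f_upper, consequents):
    # Standard Nie-Tan closed-form type reduction / defuzzification; the paper's
    # enhanced variant adds further design parameters that are omitted here.
    w = f_lower + f_upper                             # (batch, n_rules)
    return (w * consequents).sum(dim=-1) / w.sum(dim=-1).clamp_min(1e-12)
```

With the positivity and ordering constraints absorbed into the parameterization, both pieces are differentiable end to end and can be trained with Adam or any other unconstrained optimizer, which is the practical payoff of the transformation described in the abstract.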
