
Machine-learned models for magnetic materials

Published 29 Dec 2023 in cond-mat.mtrl-sci and cs.LG | (2401.00072v2)

Abstract: We present a general framework for modeling the characteristics of power magnetic materials using deep neural networks. Magnetic materials, represented by multidimensional characteristics that mimic measurements, are used to train a neural autoencoder in an unsupervised manner. The encoder learns to predict the parameters of a theoretical material model, which the decoder then uses to reconstruct the input characteristics. The model is trained on a synthetically generated set of characteristics covering a broad range of material behaviors, so that it generalizes over the underlying physics rather than merely fitting model parameters to a single measurement. After setting up the model, we demonstrate its usefulness on the complex problem of modeling magnetic materials simultaneously in the frequency and current (out-of-linear-range) domains, using measured characteristics obtained for frequencies up to $10$ MHz and H-fields up to saturation.
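The architecture described in the abstract, a neural encoder that predicts physical parameters and a fixed theoretical model acting as the decoder, can be sketched in a few lines of PyTorch. This is a minimal illustration, not the paper's implementation: the true decoder would be a frequency- and field-dependent magnetic-material model, whereas here a hypothetical saturating curve `ms * tanh(H/a)` stands in for it, and all names, network sizes, and parameter ranges are assumptions made for the sketch.

```python
import torch
import torch.nn as nn

def physics_decoder(params, h_field):
    """Hypothetical differentiable 'theoretical model': maps material
    parameters to a characteristic curve. Stands in for the paper's
    magnetic-material model with a simple saturating response."""
    ms, a = params[:, 0:1], params[:, 1:2]           # saturation level, shape parameter
    return ms * torch.tanh(h_field / (a + 1e-6))     # shape: (batch, n_points)

class ParameterEncoder(nn.Module):
    """Encoder: maps a measured (or synthetic) characteristic
    to the parameters of the theoretical model."""
    def __init__(self, n_points=128, n_params=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_points, 64), nn.ReLU(),
            nn.Linear(64, n_params), nn.Softplus(),  # keep physical parameters positive
        )
    def forward(self, x):
        return self.net(x)

torch.manual_seed(0)
n_points = 128
h = torch.linspace(-5.0, 5.0, n_points).unsqueeze(0)  # shared H-field grid
true_params = torch.rand(256, 2) * 2 + 0.5            # synthetic "materials"
curves = physics_decoder(true_params, h)              # synthetic training characteristics

encoder = ParameterEncoder(n_points)
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

with torch.no_grad():                                 # loss before any training
    init_loss = ((physics_decoder(encoder(curves), h) - curves) ** 2).mean().item()

# Unsupervised training: only the reconstruction error drives learning;
# the true parameters are never shown to the encoder.
for step in range(500):
    pred_params = encoder(curves)                     # encode: curve -> parameters
    recon = physics_decoder(pred_params, h)           # decode: parameters -> curve
    loss = ((recon - curves) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"reconstruction MSE: {init_loss:.4f} -> {loss.item():.4f}")
```

Because the decoder is a fixed physical model rather than a learned network, the encoder is forced to express each input curve through physically meaningful parameters, which is what lets the trained model generalize across material behaviors instead of overfitting a single measurement.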

