In January, Dinesh Ramakrishnan and I published a paper in Frontiers in Artificial Intelligence presenting SineKAN (Sinusoidal Kolmogorov-Arnold Network), a novel type of machine learning model. We showed that it can potentially outperform the most commonly used base layers in neural networks (Linear/Dense/MLP layers) on tasks like image classification.
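To give a flavor of the idea, here is a minimal sketch of a sinusoidal KAN-style layer. This is a hypothetical simplification for illustration, not the exact SineKAN formulation from the paper: each input feature is expanded in a sine basis with assumed fixed integer frequencies and learnable phases, and the basis responses are mixed by learnable amplitudes.

```python
import numpy as np

class SineLayer:
    """Sketch of a sinusoidal KAN-style layer (illustrative, not the paper's exact form).

    Each output is y_j = sum_{i,k} amp[j, i, k] * sin(omega[k] * x_i + phase[i, k]),
    so the per-edge activation is a learnable sine series rather than a fixed
    nonlinearity applied after a linear map (as in an MLP layer).
    """

    def __init__(self, in_dim, out_dim, grid_size=8, seed=0):
        rng = np.random.default_rng(seed)
        # Fixed integer frequencies (an assumption for this sketch).
        self.omega = np.arange(1, grid_size + 1, dtype=float)
        # Learnable phases, one per (input feature, frequency) pair.
        self.phase = rng.uniform(0.0, 2.0 * np.pi, (in_dim, grid_size))
        # Learnable amplitudes mixing the sine basis into each output.
        self.amp = rng.normal(0.0, 1.0 / np.sqrt(in_dim * grid_size),
                              (out_dim, in_dim, grid_size))

    def __call__(self, x):
        # x: (batch, in_dim) -> basis: (batch, in_dim, grid_size)
        basis = np.sin(x[:, :, None] * self.omega[None, None, :] + self.phase[None])
        # Contract the amplitude tensor against the basis responses.
        return np.einsum("oik,bik->bo", self.amp, basis)

layer = SineLayer(in_dim=4, out_dim=3)
x = np.random.default_rng(1).normal(size=(2, 4))
print(layer(x).shape)  # (2, 3)
```

In a trained model the phases and amplitudes would be fit by gradient descent; this sketch only shows the forward pass that replaces a standard Linear/Dense layer.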
Dinesh and I just published our second paper on the topic in MDPI Mathematics. There we showed empirically that these models work, and presented a constructive proof that our previous work had developed a completely new 1D function approximation series, similar to but provably different from the Fourier series, and that two-layer SineKAN models are also approximators of multivariable functions. The approximation proof meets the same approximation error standards as the Universal Approximation Theorem for traditional neural networks.

I also co-authored, with Victor Baules, two conference papers at the Machine Learning and the Physical Sciences Workshop at the international AI conference NeurIPS (one published last year and one accepted this year), showing physics applications of SineKAN for computing squared amplitudes of Standard Model physics processes. In addition, I co-authored a journal paper led by Mahmud Shamim (currently under journal review and available on arXiv) showing that these models outperform several other traditional neural networks at the task of modelling ground-state wave functions of quantum many-body systems.

Thank you to Dinesh, Mahmud, and Victor!

https://www.frontiersin.org/journals/artificial-intelligence/articles/10.3389/frai.2024.1462952/full
https://www.mdpi.com/2227-7390/13/19/3157
https://ml4physicalsciences.github.io/2024/files/NeurIPS_ML4PS_2024_118.pdf
https://arxiv.org/html/2506.01891v1