
A Lorentz-Equivariant Transformer for All of the LHC

Johann Brehmer, Víctor Bresó, Pim de Haan, Tilman Plehn, Huilin Qu, Jonas Spinner, Jesse Thaler

arXiv (2024)

Abstract
We show that the Lorentz-Equivariant Geometric Algebra Transformer (L-GATr) yields state-of-the-art performance for a wide range of machine learning tasks at the Large Hadron Collider. L-GATr represents data in a geometric algebra over space-time and is equivariant under Lorentz transformations. The underlying architecture is a versatile and scalable transformer, which is able to break symmetries if needed. We demonstrate the power of L-GATr for amplitude regression and jet classification, and then benchmark it as the first Lorentz-equivariant generative network. For all three LHC tasks, we find significant improvements over previous architectures.
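The Lorentz equivariance the abstract refers to rests on a basic fact: Minkowski inner products of four-vectors are unchanged by Lorentz transformations, so a network built from such invariants respects the symmetry by construction. A minimal NumPy sketch (illustrative values, not taken from the paper) checks this for a boost along the x-axis:

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -)
eta = np.diag([1.0, -1.0, -1.0, -1.0])

def boost_x(beta):
    """Lorentz boost along the x-axis with velocity beta (in units of c)."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * beta
    return L

# An illustrative four-momentum (E, px, py, pz)
p = np.array([5.0, 1.0, 2.0, 3.0])
p_boosted = boost_x(0.6) @ p

# The Minkowski norm p^T eta p (the invariant mass squared) is preserved
m2 = p @ eta @ p
m2_boosted = p_boosted @ eta @ p_boosted
print(np.isclose(m2, m2_boosted))  # prints True
```

L-GATr itself works in a geometric algebra over space-time rather than with raw four-vectors, but the same invariance under boosts and rotations is the property its layers are designed to preserve.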