Merge pull request #49 from ML4SCI/patch-2
Update proposal_SYMBA.md
ereinha authored Feb 20, 2024
2 parents a4a1a39 + 4b60daa commit a2407c1
Showing 1 changed file with 23 additions and 10 deletions.
33 changes: 23 additions & 10 deletions _gsocproposals/2024/proposal_SYMBA.md
@@ -1,36 +1,49 @@
---
title: Symbolic empirical representation of squared amplitudes in high-energy physics
title: Transformer Models for Symbolic Calculations of Squared Amplitudes in HEP
layout: gsoc_proposal
project: SYMBA
year: 2023
year: 2024
organization:
- Alabama
- FSU
- Kentucky
- FAU
- QU
---

## Description

One of the most important physical quantities in particle physics is the cross section, i.e., the probability that a particular process takes place in an interaction of elementary particles. Its measurement provides a testable link between theory and experiment. It is obtained theoretically mainly by calculating the squared amplitude (the squared matrix element M).
One of the most important physical quantities in particle physics is the cross section, i.e., the probability that a particular process takes place in an interaction of elementary particles. Its measurement provides a testable link between theory and experiment. It is obtained theoretically mainly by calculating the squared amplitude. The approach we use in this project is to treat the amplitude and the squared amplitude as symbolic mathematical expressions and to use language-translation models to map the amplitude to the squared amplitude.
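
As a rough illustration of this translation framing (not the project's actual pipeline), the sketch below treats toy amplitude and squared-amplitude expressions as token sequences and runs them through a small encoder-decoder Transformer in PyTorch. The vocabulary, tokenizer, and expressions are illustrative placeholders, and positional encodings are omitted for brevity.

```python
# Minimal sketch, assuming a toy symbol-level vocabulary; a real setup would
# tokenize full symbolic amplitude expressions.
import torch
import torch.nn as nn

VOCAB = ["<pad>", "<sos>", "<eos>", "e", "s", "t", "u", "m_e",
         "+", "-", "*", "/", "^", "(", ")", "2", "4"]
TOK2ID = {tok: i for i, tok in enumerate(VOCAB)}

def encode(tokens):
    """Turn a list of symbol tokens into a (seq_len, batch=1) id tensor."""
    ids = [TOK2ID["<sos>"]] + [TOK2ID[t] for t in tokens] + [TOK2ID["<eos>"]]
    return torch.tensor(ids).unsqueeze(1)

class AmplitudeTranslator(nn.Module):
    """Small encoder-decoder Transformer mapping amplitude tokens to squared-amplitude tokens."""
    def __init__(self, vocab_size, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(d_model=d_model, nhead=nhead,
                                          num_encoder_layers=num_layers,
                                          num_decoder_layers=num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        src = self.embed(src_ids)                      # (S, 1, d_model)
        tgt = self.embed(tgt_ids)                      # (T, 1, d_model)
        # Causal mask so the decoder cannot peek at future target tokens.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(0))
        hidden = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.out(hidden)                        # (T, 1, vocab_size) logits

# Placeholder "amplitude" -> "squared amplitude" pair, purely illustrative.
src = encode(["e", "^", "2", "*", "(", "s", "+", "t", ")"])
tgt = encode(["e", "^", "4", "*", "(", "s", "^", "2", "+", "t", "^", "2", ")"])

model = AmplitudeTranslator(len(VOCAB))
logits = model(src, tgt[:-1])                          # teacher forcing
loss = nn.CrossEntropyLoss()(logits.reshape(-1, len(VOCAB)), tgt[1:].reshape(-1))
loss.backward()                                        # one illustrative training step
print(f"toy loss: {loss.item():.3f}")
```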

## Duration

Total project length: 175/350 hours.

## Task ideas and expected results
* Apply symbolic machine learning techniques to predict squared amplitudes and cross sections in high-energy physics
* Develop various transformer-based models for sequence-to-sequence tasks
* Benchmark the different models on simulated physics datasets of varying complexity and sequence length to identify the best-performing model (a minimal evaluation sketch follows this list)
* Integrate with the SymbaHEP pipeline
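
As one possible starting point for the benchmarking task, the snippet below builds on the toy model sketched in the Description and reports exact-match (sequence-level) accuracy grouped by source-sequence length. Greedy decoding, the data format, and all names here are assumptions for illustration, not part of an existing SymbaHEP API.

```python
# Hypothetical benchmark helper; reuses model, src, tgt, and TOK2ID from the
# sketch above.
from collections import defaultdict
import torch

@torch.no_grad()
def greedy_decode(model, src_ids, max_len=64):
    """Greedy autoregressive decoding with the toy AmplitudeTranslator."""
    tgt_ids = torch.tensor([[TOK2ID["<sos>"]]])            # (1, batch=1)
    for _ in range(max_len):
        logits = model(src_ids, tgt_ids)                   # (T, 1, vocab)
        next_id = logits[-1, 0].argmax().view(1, 1)
        tgt_ids = torch.cat([tgt_ids, next_id], dim=0)
        if next_id.item() == TOK2ID["<eos>"]:
            break
    return tgt_ids.squeeze(1).tolist()

@torch.no_grad()
def benchmark(model, pairs):
    """pairs: list of (src_ids, tgt_ids) tensors; returns exact-match accuracy per source length."""
    hits, totals = defaultdict(int), defaultdict(int)
    for src_ids, tgt_ids in pairs:
        length = src_ids.size(0)
        pred = greedy_decode(model, src_ids)
        hits[length] += int(pred == tgt_ids.squeeze(1).tolist())  # exact symbolic match
        totals[length] += 1
    return {n: hits[n] / totals[n] for n in sorted(totals)}

print(benchmark(model, [(src, tgt)]))
```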

## Requirements
Python, C++ and some experience in Machine Learning sequence models.
Significant experience with Transformer machine learning models in Python (preferably using pytorch).

<!-- ## Test
Please use this [link](https://docs.google.com/document/d/1eMtRPR-nH2NyituMBIDAZdmcCkZF2TyUFQp6zg-z5pA/edit) to access the test for this project. -->
## Difficulty Level
Advanced

<!-- ## Test
Please use this [link](https://docs.google.com/document/d/1eMtRPR-nH2NyituMBIDAZdmcCkZF2TyUFQp6zg-z5pA) to access the test for this project. -->

## Mentors
* [Abdulhakim Alnuqaydan](mailto:[email protected]) (University of Kentucky)
* [Eric Reinhardt](mailto:[email protected]) (University of Alabama)
* [Abdulhakim Alnuqaydan](mailto:[email protected]) (Qassim University)
* [Sergei Gleyzer](mailto:[email protected]) (University of Alabama)
* [Harrison Prosper](mailto:[email protected]) (Florida State University)
* [Eric Reinhardt](mailto:[email protected]) (University of Alabama)
* [Nobuchika Okada](mailto:[email protected]) (University of Alabama)
* [Marco Knipfer](mailto:[email protected]) (University of Erlangen-Nürnberg)

Please **DO NOT** contact mentors directly by email. Instead, please email [[email protected]](mailto:[email protected]) with Project Title and **include your CV** and **test results**. The mentors will then get in touch with you.

## Links
* [Paper 1](https://ml4physicalsciences.github.io/2023/files/NeurIPS_ML4PS_2023_183.pdf)
* [Paper 2](https://iopscience.iop.org/article/10.1088/2632-2153/acb2b2)
* [Poster 1](https://nips.cc/media/PosterPDFs/NeurIPS%202023/76219.png)
* [Blog post 1](https://medium.com/@neerajanandfirst/my-journey-to-google-summer-of-code-2023-with-ml4sci-8822ce64464a)
