The process of piecing together the details of a neutrino interaction from raw detector data is called reconstruction, and it can consume millions of CPU hours per year. As plans are drawn up for future detectors, the volume of information to sort through will only grow.
Traditionally, reconstruction algorithms are written by hand, starting from the first principles of the interaction physics. However, it is becoming difficult for physicists to keep up with the sheer volume of information that modern particle detectors produce, and the resulting algorithms often fail to exploit the instrument's full capabilities.
Machine learning is a subfield of computer science concerned with teaching computers to learn algorithms from data. Instead of writing the algorithm of interest by hand, one gives the computer examples of the task performed correctly, and it learns the algorithm needed to perform that task itself.
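To make this concrete, here is a minimal sketch of the idea (not part of our actual analysis code): a single-neuron classifier, a perceptron, is shown labeled examples of a toy task and adjusts its internal weights until it reproduces the rule on its own. The toy task (learning the logical OR function) and all names are illustrative assumptions.

```python
# Minimal supervised-learning sketch: instead of hand-coding a rule,
# we show the computer labeled examples and let it fit parameters.
# Toy task (an illustrative assumption): learn the logical OR function.

def train_perceptron(examples, epochs=20, lr=0.1):
    """Learn weights for a single-neuron classifier from (inputs, label) pairs."""
    w = [0.0, 0.0]  # weights, one per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), label in examples:
            # Current prediction: fire (1) if the weighted sum exceeds zero
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            # Nudge the weights in proportion to the prediction error
            err = label - pred
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    """Apply the learned classifier to a new input."""
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Examples of the task performed correctly (the OR truth table)
examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(examples)
```

After training, `predict(w, b, ...)` reproduces the OR rule even though that rule was never written down explicitly. Deep learning scales this same idea to networks with millions of parameters.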
Deep learning is a rapidly growing area of machine learning that has proven able to learn tasks involving enormous amounts of information. Deep neural networks have achieved superhuman performance in image recognition and powered AlphaGo's well-publicized victory over Go champion Lee Sedol.
Physicists have a long history of involvement in machine learning, but have only just begun to explore the potential of deep learning in experimental research. We hope to use this new tool to search for tau neutrinos in IceCube and to assist event reconstruction in MicroBooNE.
Read an essay I wrote about the use of deep learning in particle physics.