Welcome to the catalogue of events organised by LAL
To access the Direction and CHS categories, consult the news.
Detailed agenda
HEPML workshop at NIPS14
Saturday 13 December 2014
from
to
(Europe/Paris)
at Palais des Congrès de Montréal (Level 5, room 511 c)
Description 
The website of the event:


08:30 - 10:00
Session 1

08:30
Welcome
15'
Speaker: Balázs Kégl (LAL)
08:45
HEP&ML and the HiggsML challenge
35'
We first describe the HiggsML challenge (the problem of optimizing classifiers for discovery significance, the setup of the challenge, the results, and some analysis of the outcome). In the second part we outline some of the application themes of machine learning in high-energy physics.
Speaker: Balázs Kégl (LAL) Documents: Slides
09:20
Embedding ML in Classical Statistical tests used in HEP (invited talk)
40'
I will review the ways that machine learning is typically used in particle physics, some recent advancements, and future directions. In particular, I will focus on the integration of machine learning and classical statistical procedures. These considerations motivate a novel construction that is a hybrid of machine learning algorithms and more traditional likelihood methods.
Speaker: Kyle Cranmer (New York University) Documents: Slides


10:00 - 10:30
Coffee break

10:30 - 12:10
Session 2

10:30
Presentation of the winner of the HiggsML challenge
20'
We describe the winning solution of the HiggsML challenge, the issues related to the evaluation metric and reliable assessment of model performance. Finally, we take a stab at predicting how to achieve larger improvements.
Speaker: Gábor Melis Documents: Slides
10:50
Presentation of the runner up of the HiggsML challenge
20'
High Energy Physics provides a challenging data domain with data that is highly structured, but also very noisy. I will present what I have learned analyzing this data for the HiggsML challenge, focusing on methods that are able to effectively search through a high dimensional model space while also achieving good statistical efficiency. In addition, I will discuss the role of the physicist in modelling this type of data, and I will talk about robustly applying our findings to real (not simulated) HEP data.
Speaker: Tim Salimans Documents: Slides
11:10
Presentation of the winner of the HEP meets ML prize
20'
In this talk, I will describe how we use the principles of gradient boosting to construct simple and effective regression tree functions for Higgs boson detection. We take a functional-space optimization framework that jointly optimizes the training objective and the simplicity of the functions learnt. I will talk about how this objective can be clearly related to tree searching, pruning and leaf weight estimation. Finally, I will discuss how the framework can be modularized to provide an interface for adding physics domain knowledge into the learning algorithm.
Speaker: Tianqi Chen
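The regularized objective sketched in this abstract (it underlies Tianqi Chen's XGBoost) admits closed-form solutions worth seeing concretely: for a leaf collecting per-example gradients g and Hessians h of the training loss, the optimal leaf weight is -G/(H+λ) and the gain of a split trades loss reduction against the simplicity penalty. The sketch below is my own illustration, not the speaker's code; the function names and default values of λ and γ are assumptions.

```python
import numpy as np

def leaf_weight_and_score(g, h, lam=1.0):
    """Closed-form optimal weight and objective contribution of one leaf.

    g, h: per-example gradients and Hessians of the training loss
    lam:  L2 regularization on leaf weights (part of the simplicity term)
    """
    G, H = g.sum(), h.sum()
    w = -G / (H + lam)                # optimal leaf weight
    score = -0.5 * G**2 / (H + lam)   # objective value at that weight
    return w, score

def split_gain(g, h, mask, lam=1.0, gamma=0.0):
    """Gain of splitting a leaf into left (mask) and right (~mask) children.

    gamma penalizes each added leaf, encoding the simplicity objective.
    """
    _, parent = leaf_weight_and_score(g, h, lam)
    _, left = leaf_weight_and_score(g[mask], h[mask], lam)
    _, right = leaf_weight_and_score(g[~mask], h[~mask], lam)
    return parent - (left + right) - gamma
```

Tree search then reduces to scanning candidate splits for the largest gain, and pruning to discarding splits whose gain is negative.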
11:30
Real time data analysis at the LHC : present and future
40'
The Large Hadron Collider (LHC), which collides protons at an energy of 14 TeV (for non-physicists, each beam of protons carries roughly the energy of a TGV train going at full speed), produces hundreds of exabytes of data per year, making it one of the largest sources of data in the world today. At present it is not possible to even transfer most of this data from the four main particle detectors at the LHC to "offline" data facilities, much less to permanently store it for future processing. For this reason the LHC detectors are equipped with real-time analysis systems, called triggers, which process this volume of data and select the most interesting proton-proton collisions. The LHC experiment triggers reduce the data produced by the LHC by a factor of between 1,000 and 10,000, to tens of petabytes per year, allowing its economical storage and further analysis. The bulk of this data reduction is performed by custom electronics which ignores most of the data in its decision making, and is therefore unable to exploit the most powerful known data analysis strategies developed by e.g. the machine learning community. In this talk I will cover the present status of real-time data analysis at the LHC, before explaining why the future upgrades of the LHC experiments will increase the volume of data which can be sent off the detector and into off-the-shelf data processing facilities (such as CPU or GPU farms) to tens of exabytes per year. This development will simultaneously enable a vast expansion of the physics programme of the LHC's detectors, and make it mandatory to develop and implement a new generation of real-time multivariate analysis tools in order to fully exploit this new potential of the LHC. I will explain what work is ongoing in this direction and hopefully motivate why more effort is needed in the coming years.
Speaker: Vava Gligorov


15:00 - 16:30
Session 3

15:00
Machine Learning for Ultra-High-Energy Physics (invited talk)
40'
I will describe the computational and machine learning challenges of the CRAYFIS project: a distributed cosmic ray telescope consisting of consumer smartphones and geared for the detection of ultra-high-energy cosmic rays. For more info: http://crayfis.ps.uci.edu/
Speaker: Daniel Whiteson
15:40
Weighted Classification Cascades for Optimizing Discovery Significance in the HiggsML Challenge
20'
We introduce a minorization-maximization approach to optimizing common measures of discovery significance in high-energy physics. The approach alternates between solving a weighted binary classification problem and updating class weights in a simple, closed-form manner. Moreover, an argument based on convex duality shows that an improvement in weighted classification error on any round yields a commensurate improvement in discovery significance. We complement our derivation with experimental results from the 2014 Higgs boson machine learning challenge.
Speaker: Lester Mackey
16:00
Consistent optimization of AMS by logistic loss minimization
20'
In this paper, we theoretically justify an approach popular among participants of the Higgs Boson Machine Learning Challenge to optimize approximate median significance (AMS). The approach is based on the following two-stage procedure. First, a real-valued function is learned by minimizing a surrogate loss for binary classification, such as logistic loss, on the training sample. Then, a threshold is tuned on a separate validation sample by direct optimization of AMS. We show that the regret of the resulting (thresholded) classifier, measured with respect to the squared AMS, is upper-bounded by the regret of the underlying real-valued function measured with respect to the logistic loss. Hence, we prove that minimizing the logistic surrogate is a consistent method of optimizing AMS.
Speaker: Wojciech Kotlowski
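The two-stage procedure described in this abstract can be sketched concretely. The AMS formula below is the one used in the HiggsML challenge (including the regularization term b_reg = 10); the function names and the cumulative-sum threshold sweep are illustrative choices, not the authors' implementation, and stage 1 (fitting any real-valued classifier with logistic loss) is assumed to have already produced the scores.

```python
import numpy as np

B_REG = 10.0  # regularization term from the HiggsML challenge definition of AMS

def ams(s, b, b_reg=B_REG):
    """Approximate median significance for selected signal weight s, background weight b."""
    return np.sqrt(2.0 * ((s + b + b_reg) * np.log(1.0 + s / (b + b_reg)) - s))

def tune_threshold(scores, labels, weights):
    """Stage 2: pick the score threshold that maximizes AMS on a validation set.

    scores:  real-valued outputs of the stage-1 classifier
    labels:  1 for signal events, 0 for background events
    weights: per-event importance weights; the sums over selected events give s and b
    """
    order = np.argsort(-scores)          # consider thresholds from highest score down
    lab, w = labels[order], weights[order]
    s = np.cumsum(w * lab)               # signal weight selected at each candidate cut
    b = np.cumsum(w * (1 - lab))         # background weight selected at each cut
    ams_values = ams(s, b)
    best = np.argmax(ams_values)
    return scores[order][best], ams_values[best]
```

The sweep evaluates every candidate threshold in a single pass, which is why direct optimization of AMS is cheap once the real-valued scores are fixed.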


16:30 - 17:00
Coffee break

17:00 - 18:30
Session 4

17:00
Ensemble of maximized Weighted AUC models for the maximization of the median discovery significance
20'
The Higgs Boson Machine Learning Challenge took place from May 12th 2014 to September 15th 2014. Its goal was to explore machine learning methods to improve the discovery significance of the ATLAS experiment. This talk describes the preprocessing, training and results of our model, which finished in 9th position among the solutions of 1,785 teams.
Speaker: Roberto Diaz Morales (University Carlos III de Madrid) Documents: Paper Slides Video
17:20
Deep Learning in High-Energy Physics (invited talk)
40'
We will provide a brief overview of the challenges and opportunities facing machine learning in the natural sciences, from physics to biology, and then focus on the application of deep learning methods to problems in highenergy physics. In particular we will describe the results obtained on three different problems (Higgs boson detection, Supersymmetry, and Higgs boson decay).
Speaker: Pierre Baldi Documents: Slides
18:00
Panel discussion
30'

