AUTHOR
Pia Zurita
PDF reweighting in the Hessian matrix approach
We introduce the Hessian reweighting of parton distribution functions (PDFs). Similarly to the better-known Bayesian methods, its purpose is to address the compatibility of new data and the quantitative modifications they induce within an existing set of PDFs. By construction, the method discussed here applies to PDF fits that employed a Hessian error analysis with a non-zero tolerance $\Delta\chi^2$. The principle is validated by considering a simple, transparent example. We also establish an agreement with the Bayesian technique provided that the tolerance criterion is appropriately accounted for and that a purely exponential Bayesian likelihood is assumed. As a practi…
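As a rough numerical illustration of the minimization underlying this approach (a sketch under our own assumptions, not the paper's code): the new data enter a penalized $\chi^2$ that is quadratic in the coordinates $z_k$ of the Hessian eigenvector space, $\chi^2(z) = \Delta\chi^2 \sum_k z_k^2 + (y_0 + Dz - x)^T C^{-1} (y_0 + Dz - x)$, where $D_{ik} \approx (y_i[S_k^+] - y_i[S_k^-])/2$ is built from the error sets, so the minimum follows from a single linear solve. All function and variable names below are ours.

```python
import numpy as np

def hessian_reweight(y0, D, x, cov, tol):
    """Minimize chi2(z) = tol * z.z + (y0 + D z - x)^T C^-1 (y0 + D z - x).

    y0  : (n,) central-set predictions for the new data points
    D   : (n, k) D[i, j] = (y_i[S_j^+] - y_i[S_j^-]) / 2 from the error sets
    x   : (n,) measured values of the new data
    cov : (n, n) covariance matrix of the new data
    tol : the Delta chi^2 tolerance of the original Hessian fit
    """
    cinv = np.linalg.inv(cov)
    A = tol * np.eye(D.shape[1]) + D.T @ cinv @ D
    b = D.T @ cinv @ (x - y0)
    return np.linalg.solve(A, b)  # z_min: shift along each eigendirection

def shifted_observable(o0, o_plus, o_minus, z):
    """Propagate z_min to any observable with the same linearization."""
    return o0 + 0.5 * (o_plus - o_minus) @ z
```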
Can we fit nuclear PDFs with the high-x CLAS data?
Nuclear parton distribution functions (nuclear PDFs) are non-perturbative objects that encode the partonic behaviour of bound nucleons. To avoid potential higher-twist contributions, the data probing the high-x end of nuclear PDFs are sometimes left out of global extractions despite their potential to constrain the fit parameters. In the present work we focus on the kinematic corner covered by the new high-x data measured by the CLAS/JLab collaboration. By using the Hessian re-weighting technique, we are able to quantitatively test the compatibility of these data with globally analyzed nuclear PDFs and explore the expected impact on the valence-quark distributions at high x. W…
Hessian PDF reweighting meets the Bayesian methods
We discuss the Hessian PDF reweighting, a technique intended to estimate the effects that new measurements have on a set of PDFs. The method stems straightforwardly from considering new data in a usual $\chi^2$-fit and naturally incorporates non-zero values of the tolerance, $\Delta\chi^2>1$. In comparison to the contemporary Bayesian reweighting techniques, there is no need to generate large ensembles of PDF Monte-Carlo replicas, and the observables need to be evaluated only with the central and the error sets of the original PDFs. In spite of the apparently rather different methodologies, we find that the Hessian and the Bayesian techniques are actually equivalent if the $\Delta…
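For comparison, the Bayesian route referred to here weights an ensemble of Monte-Carlo replicas by their $\chi^2$ against the new data. A minimal sketch, assuming the purely exponential (Giele-Keller-type) likelihood with the tolerance folded in, $w_k \propto \exp[-\chi^2_k/(2\Delta\chi^2)]$; the helper names are ours:

```python
import numpy as np

def exponential_weights(chi2, tol=1.0):
    """Giele-Keller-type replica weights, w_k ~ exp(-chi2_k / (2 tol)).

    chi2 : (n_rep,) chi^2 of each PDF replica against the new data
    tol  : Delta chi^2 of the underlying fit (tol = 1 recovers exp(-chi2/2))
    """
    w = np.exp(-0.5 * (chi2 - chi2.min()) / tol)  # shift by min for stability
    return w / w.sum()

def reweighted_average(obs, weights):
    """Weighted mean of an observable evaluated replica by replica."""
    return float(np.sum(weights * obs))
```

In this language, the equivalence claimed above amounts to the linear solve of the earlier sketch and this weighted average yielding the same updated results once $\Delta\chi^2$ enters both in the same way.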
The impact of the LHC nuclear program on nPDFs
The proton-lead and lead-lead runs at the LHC are providing an enormous amount of data sensitive to the nuclear modifications of the initial state. The measurements explore a region of phase space not probed by previous experiments, opening the possibility to test and, hopefully, also improve the current knowledge of nuclear parton densities. In this talk, we discuss to what extent the present quantitative results for the charge asymmetry in electroweak boson production show sensitivity to the nuclear parton distributions.
Re-weighting at the LHC: the p–Pb data impact
In this work we present selected results of a comprehensive analysis of the nuclear modifications in the proton-lead LHC Run I data, and discuss the implications for different sets of nuclear parton densities. We find that the nuclear environment has a non-negligible impact on the experimental results. We incorporate the information from Run I into the current nuclear densities and provide novel sets of nPDFs that will be useful for future predictions.
An analysis of the impact of LHC Run I proton–lead data on nuclear parton densities
We report on an analysis of the impact that the available experimental data on hard processes from proton-lead collisions during Run I at the Large Hadron Collider have on the nuclear modifications of parton distribution functions. Our analysis is restricted to the EPS09 and DSSZ global fits. The measurements that we consider comprise the production of massive gauge bosons, jets, charged hadrons and pions. This is the first time that a study of nuclear PDFs includes such a number of different observables. The goal of the paper is twofold: i) checking the description of the data by nPDFs, as well as the relevance of these nuclear effects, in a quantitative manner; ii) testing the constraining power of these data in eve…
Bayesian PDF reweighting meets the Hessian methods
New data coming from the LHC experiments have the potential to extend the current knowledge of parton distribution functions (PDFs). As a shortcut to the cumbersome and time-consuming task of performing a new PDF fit, reweighting methods have been proposed. In this talk, we introduce the so-called Hessian reweighting, valid for PDF fits that carried out a Hessian error analysis, and compare it with the better-known Bayesian methods. We establish the agreement between the two approaches, and illustrate it using inclusive jet production at the LHC.
Extracting $\hat{q}$ in event-by-event hydrodynamics and the centrality/energy puzzle
In our analysis, we combine event-by-event hydrodynamics, within the EKRT formulation, with jet quenching (ASW Quenching Weights) to obtain the high-$p_T$ $R_{\rm AA}$ for charged particles at RHIC and LHC energies for different centralities. By defining a $K$-factor that quantifies the departure of $\hat{q}$ from an ideal estimate, $K = \hat{q}/(2\epsilon^{3/4})$, we fit the single-inclusive experimental data for charged particles. This $K$-factor is larger at RHIC than at the LHC but, surprisingly, it is almost independent of the centrality of the collision.
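Since the abstract pins down an explicit definition, a one-line toy evaluation may help fix conventions; the inputs below are purely illustrative placeholders, not the fitted values of the paper:

```python
def k_factor(qhat, epsilon):
    """K = qhat / (2 * epsilon**(3/4)), the departure from the ideal estimate."""
    return qhat / (2.0 * epsilon ** 0.75)

# Purely illustrative inputs (NOT the paper's fitted values); qhat in GeV^2/fm,
# epsilon in GeV/fm^3, with natural-unit consistency assumed for K dimensionless.
print(k_factor(qhat=10.0, epsilon=10.0))  # ~0.89 for these toy numbers
```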