Every year, a committee of experts sits down with a tough job to do: from among all ICREA publications, they must find a handful that stand out from all the others. This is indeed a challenge. The debates are sometimes heated and always difficult but, in the end, a shortlist of 24 publications is produced. No prize is awarded; the only additional acknowledgement is the honour of being chosen and highlighted by ICREA. Each piece has something unique about it, whether it be a particularly elegant solution, the huge impact it has had in the media or the sheer fascination it generates as a truly new idea. Whatever the reason, these are the best of the best and, as such, we are proud to share them here.


  • Experiments and modelling in quest for high fusion performance (2018)

    Mantsinen, Mervi Johanna (BSC-CNS)


    Our recent research results have advanced the understanding of how to reach high fusion performance in experiments at the Joint European Torus (JET), in preparation for ITER. JET is the largest experimental fusion device in operation today and the only one capable of using the reactor-relevant fusion fuel mixture of the heavy hydrogen isotopes deuterium (D) and tritium (T).

    Our research has focused on the impact of neutral beam injection (NBI) and ion cyclotron resonance frequency (ICRF) heating on the fusion yield. These heating methods will be used in ITER to heat the fuel to the high temperatures required for fusion. The main ICRF scheme that we have studied, both experimentally and computationally, at JET is the heating of minority hydrogen ions in a D plasma with D beam ions. Apart from the D plasma scenario, we have also investigated the deuterium-tritium (D-T) plasma scenario through an extrapolation of D high-performance discharges.

    Our experimental and modelling results have allowed us to draw several important conclusions regarding the role of auxiliary heating in fusion plasma performance, which will help us improve that performance in the forthcoming campaign with the D-T fuel mixture at JET. One of the main goals of this flagship campaign is to achieve world-record fusion performance sustained for more than 5 s.

  • Holography, Hydrodynamics and the Quark-Gluon Plasma (2018)

    Mateos, David (UB)


    At low energies, quarks and gluons are confined inside the protons and neutrons contained in the atoms that we and the things around us are made of. However, at a temperature of one trillion degrees (a hundred thousand times the temperature at the centre of the Sun), thermal fluctuations are so violent that quarks and gluons become liberated and give rise to a new form of matter known as the "Quark-Gluon Plasma" (QGP). The QGP filled the Universe about one microsecond after the Big Bang, and it has been recreated on Earth in so-called heavy-ion collision (HIC) experiments.

    One of the main discoveries of these experiments is that the QGP behaves as an almost-perfect fluid that is well described by hydrodynamics. This is crucial because hydrodynamics is the bridge that allows us to connect theory with experiment. We have investigated the applicability of hydrodynamics in the regime that will be explored by HIC experiments over the next decade. Since this is difficult with conventional methods, we have used a string-theoretical tool known as "holography", which maps the properties of matter in our four-dimensional world to those of … gravity in five dimensions! 

    We have discovered that the formulation of hydrodynamics that is almost universally used in hydrodynamic codes, the so-called Müller-Israel-Stewart (MIS) formulation, may not correctly capture the physics of the QGP in this new regime, which could potentially jeopardize our interpretation of the next generation of experiments.

    Fortunately, by formulating the problem in terms of five-dimensional gravity, holography also suggests a solution that is currently under investigation.

  • Global trait–environment relationships of plant communities (2018)

    Mencuccini, Maurizio (CREAF)


    Terrestrial ecosystems (e.g., grasslands, forests) provide a variety of services to human societies, for example climate regulation and the provision of water, energy and materials such as timber or animal fodder. These ecosystem ‘functions’ depend on the species (grasses, shrubs, trees) that make up these ecosystems. It is now known that it is not so much the taxonomy of the species that matters, but rather how their leaves, stems and roots are made, i.e., their ‘functional’ attributes, or traits. When a grassland or a forest community is composed of many species, it is the community-level average of the traits of all the component species that defines the level of the observed ecosystem function. Key questions are a) to what extent these community-level trait compositions differ globally, and b) whether environmental drivers at local and/or global scales affect community-level trait values.

    Here, we perform a global, plot-level analysis of trait–environment relationships, using a database with more than 1.1 million vegetation plots and 26,632 plant species with trait information. We find two main community-level trait axes that capture half of the global trait variation across these 1.1 million vegetation plots. These two axes represent plant stature and the leaf traits controlling resource acquisitiveness (light, nutrients, etc.), similar to prior results at the scale of individual species.

    We found that climate and soil conditions at the global scale exert only a weak control on community-level trait averages. Our results indicate that, at a fine spatial grain, macro-environmental drivers are much less important for functional trait composition than has previously been assumed. Instead, trait combinations seem to be predominantly filtered by local-scale factors such as disturbance, fine-scale soil conditions, niche partitioning and biotic interactions.
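    The community-level averaging described above can be sketched in a few lines. The following is a minimal illustration with hypothetical toy data (the array sizes, random values and variable names are all assumptions, not the study's actual pipeline): each plot's trait value is the abundance-weighted mean of its species' traits, and a principal-component decomposition of the plot-by-trait matrix plays the role of the main community-level trait axes.

```python
import numpy as np

# Hypothetical toy data: 6 plots, 5 species, 3 measured traits.
rng = np.random.default_rng(0)
n_plots, n_species, n_traits = 6, 5, 3

traits = rng.normal(size=(n_species, n_traits))      # species x traits
abundance = rng.random(size=(n_plots, n_species))    # plots x species
abundance /= abundance.sum(axis=1, keepdims=True)    # relative abundances

# Community-weighted mean (CWM): abundance-weighted average of
# the traits of the component species, one row per plot.
cwm = abundance @ traits                             # plots x traits

# Principal components of the centred CWM matrix stand in for the
# main community-level trait axes; 'explained' gives the fraction
# of CWM trait variation captured by each axis.
centred = cwm - cwm.mean(axis=0)
_, s, _ = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(explained)
```

    In the real analysis the axes are interpreted ecologically (plant stature, leaf resource acquisitiveness); the sketch only shows the mechanics of going from species traits and abundances to community-level axes.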

  • Architecting Graphene Micromotors: A Simple Paper-Based Manufacturing Technology (2018)

    Merkoçi, Arben (ICN2)


    The group led by Prof. Merkoçi developed a graphene oxide rolled-up tube production process that uses wax-printed membranes for the fabrication of on-demand engineered micromotors with different levels of oxidation, thickness and lateral dimensions. Using this technology, the graphene oxide rolled-up tubes have shown magnetic and catalytic movement upon the addition of magnetic nanoparticles or sputtered platinum onto the surface of the graphene-oxide-modified wax-printed membranes prior to the scrolling process. As a proof of concept, the authors have shown that the as-prepared catalytic graphene oxide rolled-up micromotors can be successfully exploited for oil removal from water. This micromotor production technology relies on an easy, operator-friendly, fast and cost-efficient wax-printed paper-based method and may offer a myriad of hybrid devices and applications. The developed technology may open the way to the simple fabrication of micromotors from other 2D materials for various applications.

  • The most precise measurement of the (dark) matter distribution in the universe (2018)

    Miquel Pascual, Ramon (IFAE)


    There is overwhelming evidence that most of the matter in the universe is in a "dark" form that neither emits nor blocks light, and is therefore invisible to even the largest telescopes. While the detailed nature of this "dark matter" remains unknown, its gravitational interactions can be used to detect it and to study its spatial distribution as a function of cosmic time, which, in turn, depends on the nature of the mysterious "dark energy" responsible for the current accelerated expansion of the universe. Particularly relevant is the so-called "weak gravitational lensing" effect, in which the observed shapes of distant galaxies are slightly distorted by the gravitational pull of the masses between them and us. The statistical properties of a large set of images of distant galaxies can then be studied to determine the distribution of the intervening matter, which is mostly dark.

    The Dark Energy Survey (DES) is an international collaboration of 350 scientists from 28 institutions in 8 countries that is surveying an eighth of the sky using DECam, a 570-megapixel camera installed at the Blanco 4-meter telescope in the Cerro Tololo Inter-American Observatory in Chile. Looking at the data from the first season of observations (2013/14), DES has measured the shapes of about 35 million distant galaxies. Combining the measurement of the correlations between the shapes of these distant galaxies (sources) with the correlations in the positions of closer galaxies (lenses) and with the cross-correlations between the shapes of the sources and the positions of the lenses (a measurement led by IFAE), DES has produced the most precise determination of the clustering of the (mostly dark) matter (fig. 2). This is the first direct measurement that is as precise as those coming from the cosmic microwave background radiation, which is sensitive to the tiny inhomogeneities present when the universe was 380,000 years old; these can then be extrapolated to predict the large inhomogeneities we see today. The agreement between this extrapolation and the DES direct measurement provides a stringent test of the current cosmological model, with a cosmological constant as the dark energy.

  • A five-continent, 100,000-participant experiment to test Einstein's theory of local realism (2018)

    Mitchell, Morgan W. (ICFO)
    Acín Dal Maschio, Antonio (ICFO)


    A Bell test is a randomized trial that compares experimental observations against the philosophical worldview of local realism, in which the properties of the physical world are independent of our observation of them and no signal travels faster than light. A Bell test requires spatially distributed entanglement, fast and high-efficiency detection and unpredictable measurement settings. Although technology can satisfy the first two of these requirements, the use of physical devices to choose settings in a Bell test requires seemingly circular arguments that make assumptions about the same physics one aims to test. Bell himself noted this weakness in using physical setting choices and argued that human ‘free will’ could be used rigorously to ensure unpredictability in Bell tests. We led and coordinated a set of local-realism tests using human choices, valid without assumptions about predictability in physics. We recruited about 100,000 human participants to play an online video game that incentivizes fast, sustained input of unpredictable selections and illustrates Bell-test methodology. The participants generated 97,347,490 binary choices, which were directed via a scalable web platform to 12 laboratories on five continents, where 13 experiments tested local realism using photons, single atoms, atomic ensembles and superconducting devices. Over a 12-hour period on 30 November 2016, participants worldwide provided a sustained data flow of over 1,000 bits per second to the experiments, which used different human-generated data to choose each measurement setting. The observed correlations strongly contradict local realism and other realistic positions in bipartite and tripartite scenarios. 
    Project outcomes include closing the ‘freedom-of-choice loophole’ (the possibility that the setting choices are influenced by hidden variables to correlate with the particle properties), the utilization of video-game methods for rapid collection of human-generated randomness, and networking techniques enabling global participation in experimental science.
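    As a back-of-the-envelope consistency check of the figures quoted above (assuming, purely for illustration, that all of the binary choices fell within the 12-hour window), the average data rate implied by 97,347,490 choices in 12 hours sits comfortably above the stated sustained flow of 1,000 bits per second:

```python
# Figures quoted in the text: 97,347,490 binary choices over a
# 12-hour period on 30 November 2016.
total_bits = 97_347_490
seconds = 12 * 3600  # 43,200 s

average_rate = total_bits / seconds
print(f"{average_rate:.0f} bits per second")  # prints "2253 bits per second"
```

    The average of roughly 2,250 bits per second is consistent with the text's claim of a sustained flow of over 1,000 bits per second reaching the experiments.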