Highlights

Every year, a committee of experts sits down with a tough job to do: from among all ICREA publications, they must find a handful that stand out from all the others. This is indeed a challenge. The debates are sometimes heated and always difficult but, in the end, a shortlist of 24 publications is produced. No prize is awarded; the only additional acknowledgement is the honour of being chosen and highlighted by ICREA. Each piece has something unique about it, whether it be a particularly elegant solution, the huge impact it has had in the media or the sheer fascination it generates as a truly new idea. Whatever the reason, these are the best of the best and, as such, we are proud to share them here.

LIST OF SCIENTIFIC HIGHLIGHTS

  • Mapping brain activity enabled by graphene  (2019)

    Sánchez-Vives, María Victoria (IDIBAPS)
    Garrido Ariza, Jose A. (ICN2)
    Durduran, Turgut (ICFO)

    Recording infraslow brain signals (<0.1 Hz) with microelectrodes is severely hampered by current microelectrode materials, primarily due to limitations resulting from voltage drift and high electrode impedance. Hence, most recording systems include high-pass filters that solve saturation issues but come hand in hand with loss of physiological and pathological information. In this work, we used flexible epicortical and intracortical arrays of graphene solution-gated field-effect transistors (gSGFETs) to map cortical spreading depression and demonstrate that gSGFETs are able to record, with high fidelity, infraslow signals together with signals in the typical local field potential bandwidth. The wide recording bandwidth results from the direct field-effect coupling of the active transistor, in contrast to standard passive electrodes, as well as from the electrochemical inertness of graphene. Taking advantage of such functionality, we envision broad applications of gSGFET technology for monitoring infraslow brain activity both in research and in the clinic.
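    The trade-off described above, where AC-coupled recording avoids amplifier saturation but discards infraslow content, can be illustrated with a toy signal. The sketch below uses a generic first-order high-pass filter, not the authors' recording chain; the 0.1 Hz cutoff and the two test frequencies are illustrative assumptions.

```python
import numpy as np

def first_order_highpass(x, dt, fc):
    """Discrete first-order RC high-pass filter with cutoff fc (Hz)."""
    rc = 1.0 / (2.0 * np.pi * fc)
    alpha = rc / (rc + dt)
    y = np.zeros_like(x)
    for i in range(1, len(x)):
        y[i] = alpha * (y[i - 1] + x[i] - x[i - 1])
    return y

def amplitude_at(x, dt, f):
    """Amplitude of the component at frequency f (Hz) via the FFT."""
    spec = np.fft.rfft(x)
    k = int(round(f * len(x) * dt))  # FFT bin index for frequency f
    return 2.0 * np.abs(spec[k]) / len(x)

dt = 0.01                              # 100 Hz sampling
t = np.arange(0.0, 200.0, dt)          # 200 s of data
slow = np.sin(2 * np.pi * 0.05 * t)    # infraslow component (0.05 Hz)
fast = np.sin(2 * np.pi * 10.0 * t)    # LFP-band component (10 Hz)

y = first_order_highpass(slow + fast, dt, fc=0.1)

a_slow = amplitude_at(y, dt, 0.05)     # strongly attenuated
a_fast = amplitude_at(y, dt, 10.0)     # passes almost unchanged
print(a_slow, a_fast)
```

    With the cutoff sitting above the infraslow band, the 0.05 Hz component loses more than half its amplitude while the 10 Hz component is essentially untouched; this is the kind of information loss that a wide-bandwidth transducer avoids.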

  • Cellular processes and their respective regulatory mechanisms  (2019)

    Serrano Pubul, Luis (CRG)

    Here, we determined the relative importance of different transcriptional mechanisms in the genome-reduced bacterium Mycoplasma pneumoniae, employing an array of experimental techniques under multiple genetic and environmental perturbations. Of the 143 genes tested (21% of the bacterium's annotated protein-coding genes), only 55% showed an altered phenotype, highlighting the robustness of biological systems. We identified nine transcription factors (TFs) and their targets, representing 43% of the genome, and 16 regulators that indirectly affect transcription. Only 20% of transcriptional regulation in response to perturbations is mediated by canonical TFs. Using a Random Forest model, we quantified the non-redundant contribution of different mechanisms such as supercoiling, metabolic control, RNA degradation, and chromosome topology to transcriptional changes. Model-predicted gene changes correlate well with experimental data in 95% of the tested perturbations, explaining up to 70% of the total variance when noise is also considered. This analysis highlights the importance of considering non-TF-mediated regulation when engineering bacteria.
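    The Random Forest step above ranks mechanisms by their non-redundant contribution to the observed transcriptional changes. The idea behind such a ranking can be sketched with a generic permutation-importance loop; the linear stand-in model and the synthetic predictors below are illustrative assumptions, not the study's model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-gene predictors standing in for regulatory mechanisms
# (names and weights are invented for illustration):
n = 500
supercoiling = rng.normal(size=n)
metabolic = rng.normal(size=n)
unrelated = rng.normal(size=n)
X = np.column_stack([supercoiling, metabolic, unrelated])

# Simulated expression change: driven mostly by "supercoiling",
# weakly by "metabolic", and not at all by the third predictor.
y = 2.0 * supercoiling + 0.5 * metabolic + 0.1 * rng.normal(size=n)

# Stand-in model: ordinary least squares (the study used a Random Forest).
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
base_mse = np.mean((X @ coef - y) ** 2)

def permutation_importance(j):
    """Rise in MSE when predictor j is shuffled, which breaks its link
    to y while preserving its marginal distribution."""
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    return np.mean((Xp @ coef - y) ** 2) - base_mse

importances = [permutation_importance(j) for j in range(X.shape[1])]
print(importances)  # largest for supercoiling, near zero for unrelated
```

    Shuffling a predictor destroys its relationship to the response while preserving its distribution, so the resulting rise in error measures only that predictor's non-redundant contribution.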

  • Cooling Surfaces Without Consuming Energy? Yes, we can!  (2019)

    Sotomayor Torres, Clivia Marfa (ICN2)

    Temperature regulation keeps us comfortable and ensures the reliable performance of machines such as computers. Cooling systems account for 15% of global energy consumption and are responsible for 10% of greenhouse gas emissions. Our phononics research led us to a novel two-dimensional, plastic-free material able to remove heat, cooling down the surface on which it is placed without energy consumption or gas emissions of any kind.

    The material is inspired by the Earth's efficient temperature-regulation mechanism, namely radiative sky cooling. Although our planet is heated mainly by the Sun, it also emits infrared radiation to outer space, since this kind of radiation is not captured by the atmosphere. Our material cools a silicon wafer under direct sunlight by 14 ºC, whereas ordinary soda-lime glass lowers its temperature by just 5 ºC. It consists of a single self-assembled layer of silica spheres 8 µm in diameter, like sand grains but a million times smaller. This layer behaves almost as an ideal infrared emitter, providing a radiative cooling power of up to 350 W/m² for a hot surface such as a solar panel. This would remove half of the heat accumulated in a typical solar panel on a regular clear day, which is enough to increase the relative efficiency of a solar cell by 8%. Considering the global solar energy production in 2017, such an efficiency increase would represent enough energy to power the city of Paris for an entire year.

    The physics behind it is the interaction of phonons (quanta of atomic lattice vibrations) and polaritons (quanta of hybrid light-matter excitations), forming surface phonon polaritons (see Fig. 1), which have been shown by researchers in France to transport thermal energy over millimetre distances. The layer thickness of our material, six times thinner than state-of-the-art radiative cooling materials, is an added bonus. This research was awarded the Collider Tech Award 2019, a prize that encourages further development of the invention, and it is protected by a European patent application.
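    The quoted cooling power can be cross-checked with a back-of-envelope Stefan-Boltzmann estimate of the net radiative exchange between a hot emitter and a cold sky. The surface temperature, effective sky temperature and emissivity below are illustrative assumptions, not values from the paper.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiative_cooling(t_surface_k, t_sky_k, emissivity):
    """Net power per unit area radiated by a grey emitter facing an
    effective sky temperature (ideal-exchange approximation)."""
    return emissivity * SIGMA * (t_surface_k**4 - t_sky_k**4)

# A solar panel heated to ~57 C under a clear sky (effective sky
# temperature ~255 K), with near-ideal infrared emissivity -- all
# of these numbers are assumed for illustration:
p = net_radiative_cooling(330.0, 255.0, 0.95)
print(f"{p:.0f} W/m^2")  # same order as the reported 350 W/m^2
```

    Real devices also exchange heat by convection and absorb some sunlight, so this ideal-exchange figure is only an upper-bound sanity check on the measured value.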

  • Synchronisation revisited: first nanoscale demonstration (2019)

    Sotomayor Torres, Clivia Marfa (ICN2)

    Synchronisation was first described in 1665 by Christiaan Huygens, reporting on his experiments with two pendulum clocks. Synchronisation has since been studied extensively at macroscopic scales, but not in the nanoscale regime. And why not? Because it is challenging to obtain comparable results at that scale: the design of the structures, the fabrication tolerances and the measurement methodology must all be reproducible. Spontaneous synchronisation between two systems requires specific conditions: both systems must be self-sustained oscillators, meaning that they generate their own rhythms without the need for an external source, and they must synchronise through a weak interaction, i.e., not because the systems are strongly connected. In our research these two conditions were unambiguously fulfilled for the first time, for two coupled nanobeams, or optomechanical (OM) oscillators (see Fig. 1), which are 10,000 times smaller than Huygens' pendulum clocks. The coupling was engineered as a narrow beam at one end of the oscillators.

    Unlike Huygens' pendulums, which were driven by the mechanical motion of the clock, our OM oscillators are driven to self-sustained oscillations by radiation pressure forces exerted on them by infrared lasers. Each oscillator, A and B, receives light from a different laser and reaches its own vibration frequency. Oscillator A is set to oscillate more strongly, i.e., with a larger amplitude than oscillator B. The experiment shows that, thanks to the design of the beam linking them, the frequency of B becomes that of A, thus achieving synchronisation. We were able to control the collective dynamics (the synchronised state) and concluded that the more oscillators there are, the less noise the system generates, yielding better defined frequencies.

    This research sets a firm basis for reconfigurable networks of nano-optoelectromechanical oscillators for applications in, e.g., neuromorphic photonic computing, a field of research that aims to imitate neurological structures to improve computation.
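    The frequency pulling described above, where weakly coupled self-sustained oscillators settle on a common frequency, can be sketched with the textbook Kuramoto model of two phase oscillators. This is a generic analogue rather than the authors' optomechanical equations; the natural frequencies and coupling strength below are illustrative assumptions.

```python
import math

def mean_frequencies(w1, w2, k, dt=0.002, t_total=300.0, t_skip=50.0):
    """Integrate two coupled phase oscillators (Kuramoto model),
        dth1/dt = w1 + k*sin(th2 - th1)
        dth2/dt = w2 + k*sin(th1 - th2)
    and return each oscillator's average frequency after a transient."""
    th1 = th2 = 0.0
    start1 = start2 = 0.0
    n_skip = int(t_skip / dt)
    for i in range(int(t_total / dt)):
        if i == n_skip:                # discard the locking transient
            start1, start2 = th1, th2
        d1 = w1 + k * math.sin(th2 - th1)
        d2 = w2 + k * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    span = t_total - t_skip
    return (th1 - start1) / span, (th2 - start2) / span

# Detuned oscillators; phase locking requires k > |w1 - w2| / 2 = 0.1.
free = mean_frequencies(1.0, 1.2, k=0.0)    # uncoupled: own rhythms
locked = mean_frequencies(1.0, 1.2, k=0.5)  # weakly coupled: one frequency
print(free, locked)
```

    Above the locking threshold the pair adopts a common compromise frequency (here the mean, since this toy coupling is symmetric; in the experiment the stronger oscillator A dominates); below it, each keeps its own rhythm.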

  • A scientific journey from laboratory idea to new cancer therapy (2019)

    Soucek, Laura (VHIO)

    Everybody working in cancer research dreams of the ideal cancer drug that could attack cancer cells, but not normal tissues. However, the majority of our targets so far are in the most redundant compartments of cells that can quickly rewire to compensate for our attacks. Hence, novel opportunities might lie in the identification of less evolutionarily degenerate nodes in cancer. Some of these functions might be identified in the nuclei of cells (a compartment less accessible to standard drugs), where many proteins are intrinsically disordered, lacking a defined three-dimensional structure amenable to attack by canonical small molecule inhibitors. But are these challenges sufficient to dismiss these targets as “undruggable”? Our answer is definitely not.

    In our latest publication in Science Translational Medicine [1], we established the feasibility of pharmacologically targeting MYC, the most infamous "undruggable" target, deregulated in the majority of human cancers, by making use of a cell-penetrating polypeptide called Omomyc. I designed Omomyc when I was still a student and used it in genetically engineered models to establish the therapeutic potential of MYC inhibition to stop tumor progression. However, Omomyc was deemed too bulky and unfit ever to become a drug. This is where our latest publication proves that assumption wrong. In [1], we showed that a purified, recombinantly produced Omomyc polypeptide has cell-penetrating properties and is disruptive in multiple ways: not only can Omomyc sequester MYC in complexes unable to recognize DNA, it also shows a dramatic therapeutic effect in non-small cell lung cancer, while displaying safety and lack of toxicity even upon long-term treatment.

    Overall, our results provide the first evidence and preclinical validation of the Omomyc mini-protein as an excellent candidate for clinical development. Indeed, clinical trials are now planned for Q1 2021 to test its clinical value in patients.

  • Scientific Innovation and Scientific Rationality: A Conceptual Explication and a Dilemma (2019)

    Sturm, Thomas (UAB)

    Scientists are often asked to promote innovation and aid society through, for instance, novel drugs and therapies, means of communication, ways of making technical devices more energy efficient, or methods for teaching mathematics to schoolchildren. Increasingly, they are also invited (if not urged) to innovate science itself. Universities, grant agencies, and governments encourage researchers to devise novel questions, methods, concepts, theories, goals, instruments, and even research institutions. But while the terminology of innovation is widely used, its use is all too often rhetorical rather than reflective. The article aims to foster philosophical debate concerning scientific innovation.

    As with innovations in markets, we can usefully view scientific innovation as one stage within a larger process from invention to diffusion; more specifically, innovation is a consequence of those inventions that are recognized as useful for changing research in non-incremental ways. However, unlike ‘discovery’, ‘innovation’ applies to elements that make possible, but do not by themselves establish or guarantee, correct research outputs. Most importantly, we assume that innovations are deliberately prepared and accepted, given that they imply violations or revisions of established rules of science. In this sense, innovation presupposes at least minimal rationality. This, however, leads to a tension between two plausible claims: (1) scientific innovation can be explained rationally; (2) no existing account of rationality explains scientific innovation. In particular, I argue that neither standard nor bounded theories of rationality can deliver a satisfactory explanation of scientific innovations. At the moment, it is unclear with what to replace them. Thus, despite our legitimate interest in scientific innovation, calls for research proposals and submissions should be formulated in more reflective and careful ways; and we should not be excessively optimistic concerning our ability to rationally predict and steer the future direction of the scientific enterprise.
