Finding the relationship between genomic sequence variation and disease has been one of the major focuses and challenges in biomedicine, as it allows the development of targeted diagnosis and therapy protocols. The possibility of easily decoding the sequence of genomes has recently pushed the understanding of disease forward to unprecedented levels, building the basis of personalized medicine, where each patient will be diagnosed and treated according to their particular genomic context. In this sense, the identification of the genomic changes that lead to cancer is essential to understanding tumor variability, and opens the door to more precise, personalized and efficient treatments as alternatives to current unspecific and aggressive therapies.

As the majority of tumors arise and evolve from changes (somatic mutations) in the genome of a single cell, current protocols for identifying the genetic basis of oncogenesis consist of sequencing and comparing the genomes of cancer and healthy cells from the same individual. Although many cancer-driving mutations have already been found in tumor genomes, the analysis of cancer genomes still entails important challenges and limitations, leaving a significant number of cancer mutations undetected. In addition, existing protocols are computationally complex and expensive, restricting their use to a few centers with sufficient expertise and resources and leaving an important fraction of the community without access to them.

Our contribution to overcoming these limitations was the development of SMUFIN, an innovative approach that compares tumor and normal genomes directly to identify nearly all types of somatic mutations potentially associated with cancer (from single-nucleotide changes to large chromosomal translocations) in a single execution.
SMUFIN is also much faster than other existing methods, as it is able to analyze multiple patients at a time in less than 10 hours. The complete study, published in Nature Biotechnology, includes a positive evaluation of the performance of SMUFIN on different types of tumor genomes. SMUFIN was also able to identify complex chromosomal rearrangements, in Mantle Cell Lymphoma and Medulloblastoma samples, that are invisible to other methods and that had previously been related to aggressive tumors. Taken together, SMUFIN constitutes the first realistic step forward in the analysis of cancer genomes.
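SMUFIN's published algorithm is considerably more sophisticated, but the core idea of comparing tumor and normal sequence data directly can be sketched as a toy k-mer comparison: sequence fragments that are well supported in tumor reads yet absent from the normal reads of the same patient point to a somatic change. All read sequences, function names and thresholds below are invented for illustration:

```python
from collections import Counter

def kmers(reads, k=4):
    """Count every length-k substring across a set of reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def tumor_specific_kmers(tumor_reads, normal_reads, k=4, min_count=2):
    """k-mers seen repeatedly in tumor reads but never in normal reads:
    a crude proxy for sequence created by a somatic mutation."""
    tumor, normal = kmers(tumor_reads, k), kmers(normal_reads, k)
    return {km for km, n in tumor.items() if n >= min_count and km not in normal}

normal = ["ACGTACGT", "ACGTACGT"]
tumor  = ["ACGTTCGT", "ACGTTCGT"]   # single-nucleotide change A->T
print(sorted(tumor_specific_kmers(tumor, normal)))
# -> ['CGTT', 'GTTC', 'TCGT', 'TTCG']
```

The mutation does not have to be named in advance: any event that alters the local sequence, from a point mutation to a translocation breakpoint, produces tumor-specific k-mers, which is why a direct comparison can cover so many mutation types at once.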
Every year, a committee of experts sits down with a tough job to do: from among all ICREA publications, they must find a handful that stand out from all the others. This is indeed a challenge. The debates are sometimes heated and always difficult but, in the end, a shortlist of 24 publications is produced. No prize is awarded, and the only additional acknowledgement is the honour of being chosen and highlighted by ICREA. Each piece has something unique about it, whether it be a particularly elegant solution, the huge impact it has had in the media or the sheer fascination it generates as a truly new idea. Whatever the reason, these are the best of the best and, as such, we are proud to share them here.
LIST OF SCIENTIFIC HIGHLIGHTS
SMUFIN, an innovative approach to find cancer associated mutations (2014)
Torrents Arenales, David (BSC-CNS)
A new mechanism for wound healing (2014)
Trepat, Xavier (IBEC)
When we think of wound healing, we normally think of wounds to our skin. But wounds happen inside the body in all sorts of tissues and organs, and can have implications in many chronic diseases such as diabetes and asthma. Wounds also favour cancer progression by providing a physical and chemical environment that promotes the invasion of malignant cells.

In this study, we designed a new way to decipher the mechanisms of wound healing, and by doing so we uncovered a new understanding of how cells move and work together to close a gap in a tissue.

It had been known for a while that two different mechanisms contribute to wound healing. One is the ‘purse-string’ method, where a ring of contractile proteins forms at the edges of the wound and tightens like the strings of a purse. The other one is ‘cell crawling’, where cells themselves throw out ‘arms’ called lamellipodia to drag themselves along to close the gap. In some wounds, both mechanisms are thought to occur simultaneously; in others, only one of the two is used.

Here we pioneered a technique to measure the nano-scale forces behind wound healing. Using this technique, we discovered that the two currently accepted mechanisms are not sufficient to fully explain the phenomenon. Instead, we showed that a new mechanism applies in which cells assemble supracellular contractile arcs that compress the tissue under the wound.

By combining experiments and computational modeling, we showed that contractions arising from these arcs make the wound heal in a quicker and more robust way.

Being able to optimize tissue repair is a major need for the treatment of acute and chronic diseases. The discovery of the basic mechanism reported in this study is a new step in the quest to achieve effective organ regeneration.
Brightening up the photosynthetic complex (2014)
van Hulst, Niek F. (ICFO)
Life on earth is essentially powered by the sun. Plants, bacteria and algae collect sunlight to store its energy and synthesize high-energy molecular species: the process of photosynthesis. The photosynthetic complexes that harvest the sunlight transfer the light energy very effectively, with a remarkable transport efficiency, even above 90%. It is thought that nature exploits quantum concepts, such as coherence and delocalisation, to achieve this superior performance. The light-harvesting complexes are therefore the subject of intense study, aimed at learning these tricks of nature's design. Yet, because such complexes evolved for optimal light collection and transfer, they do not easily lose the light energy. Thus they emit very little light, and it is hard to unveil their secrets.
Fortunately, artificial optical antennas can be designed to enhance the capture and emission of light. Metal nanoantennas in particular confine light to the nanometer scale and speed up the photocycle of light emission. Metal nanoantennas therefore seem ideal for addressing light-harvesting antenna complexes and prompting faster and brighter emission. This is exactly what we have done to brighten up individual LH2 complexes of purple bacteria.
We coupled single LH2 complexes resonantly to a gold nanoantenna. This way, the fluorescence decay time sped up to 20 ps and the quantum efficiency was enhanced from only a few percent to 50%. As a result, almost 1000 times more emission was collected from a single complex.
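These numbers fit a simple bookkeeping exercise: collected emission scales roughly as excitation rate times quantum efficiency. Taking the intrinsic quantum efficiency to be 3% (an assumed value for "a few percent"), the efficiency gain alone is a factor of about 17, implying that the antenna must also enhance the excitation rate by roughly 60-fold to reach the observed ~1000x brightening:

```python
# Rough bookkeeping for antenna-enhanced single-complex brightness:
# collected emission ∝ (excitation rate) × (quantum efficiency).
qe_free    = 0.03   # intrinsic quantum efficiency ("a few percent"; assumed 3%)
qe_antenna = 0.50   # antenna-enhanced quantum efficiency (reported value)

qe_gain = qe_antenna / qe_free           # brightness gain per excitation
total_gain = 1000                        # overall enhancement reported
excitation_gain = total_gain / qe_gain   # implied excitation-rate enhancement
print(f"QE gain ~ {qe_gain:.1f}x, implied excitation gain ~ {excitation_gain:.0f}x")
# QE gain ~ 16.7x, implied excitation gain ~ 60x
```

The split between the two factors depends on the assumed intrinsic efficiency; the point is only that neither factor alone accounts for a thousand-fold brightening.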
Using the bright photon emission of the bacterial complex, we could look into its quantum properties under ambient conditions. To our surprise, the bacterial complex revealed “anti-bunching” of photons: no two photons are ever emitted at the same time, the typical hallmark of a non-classical single-photon emitter. Finding quantum character in a room-temperature bio-system is remarkable, even more so when one realizes that the LH2 complex contains 27 bacterio-chlorophylls, coordinated in two rings, acting as antenna molecules. All 27 molecules cooperate to act jointly as one quantum system! Clearly the system is coupled, and quantum transport plays a role in natural light-harvesting.
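Antibunching is measured with a Hanbury Brown-Twiss setup: the emitted light hits a 50/50 beamsplitter, and one counts how often both output detectors fire in the same time bin. A toy simulation (idealized lossless detectors; bin counts invented, not data from the experiment) shows why an emitter that releases at most one photon per bin gives zero coincidences, while classical pairs do not:

```python
import random
random.seed(0)

def hbt_coincidences(emission_bins):
    """Route each photon in a time bin to one of two detectors at random
    (a 50/50 beamsplitter) and count bins where BOTH detectors fire."""
    coincidences = 0
    for photons in emission_bins:
        det_a = sum(1 for _ in range(photons) if random.random() < 0.5)
        det_b = photons - det_a
        if det_a > 0 and det_b > 0:
            coincidences += 1
    return coincidences

n = 10_000
print(hbt_coincidences([1] * n))  # single-photon emitter: 0 coincidences
print(hbt_coincidences([2] * n))  # photon pairs: roughly n/2 coincidences
```

A single photon cannot split at the beamsplitter, so the coincidence rate at zero delay vanishes; that vanishing rate is the "hallmark" referred to above.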
Fascinating questions remain: do other biological processes exploit quantum effects, and can we learn from nature in the development of more efficient solar cells?
Neutrinos: still skinny after all (2014)
Verde, Licia (UB)
Scientists have many tools for measuring the cosmic structures of the universe, which include galaxies, galaxy clusters, and intergalactic gas. Cosmic structures can be observed directly, such as through observations of large-scale structure, or indirectly, such as with experiments that measure temperature fluctuations in the cosmic microwave background. Although having multiple ways to measure cosmic structure is beneficial, there is one problem: the measurements don't agree.

Recently, several studies have suggested that this disagreement, or tension, in the data can be relieved by massive sterile neutrinos. Neutrinos were originally thought to be massless, but experiments later showed that they do have mass. Massive neutrinos suppress the growth of the structures that lead to the formation of galaxy clusters. For this reason, it has been claimed that fairly massive neutrinos resolve the tension in the experimental data and bring the different measurements into better agreement.

But in a new study we show that this may not be so. Massive neutrinos (as massive as has been proposed) do not bring about a new cosmological concordance between the measurements. Instead, the apparent concordance may result from systematic biases in the measurements. The results suggest that the tension between the measurements must be resolved either by considering systematics in one or more of the data sets or, if further investigation shows that correcting systematic effects does not resolve the tension, by considering new physics other than the introduction of massive neutrinos.

In terms of a larger framework for viewing cosmic structure, the results strongly favor the cold dark matter model with a cosmological constant over more complex models extended with massive neutrinos. Yet the questions surrounding cosmic structure are still far from being answered. There is no evidence for significantly massive neutrinos in cosmology.
Therefore there is not yet any need to extend the standard cosmological model with an extra parameter for the neutrino mass. Instead, future investigations will likely focus on finding exactly what the neutrino mass is. Particle physics experiments give a firm lower limit to the neutrino mass; cosmology at present gives an upper limit. But there is not much wiggle room left. The next generation of cosmology surveys will have enough statistical power to really see neutrinos.
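The suppression mechanism mentioned above can be quantified with two standard back-of-the-envelope relations (textbook approximations, not the analysis pipeline of this study): the neutrino density is Ω_ν h² = Σm_ν / 93.14 eV, and on small scales the matter power spectrum is suppressed by roughly ΔP/P ≈ -8 f_ν, where f_ν = Ω_ν / Ω_m is the neutrino fraction of the matter density. The fiducial values Ω_m = 0.31 and h = 0.67 below are assumed for illustration:

```python
# Back-of-the-envelope: how a neutrino mass sum translates into
# small-scale matter-power suppression.
def omega_nu_h2(sum_mnu_eV):
    """Neutrino density parameter: Omega_nu h^2 = sum(m_nu) / 93.14 eV."""
    return sum_mnu_eV / 93.14

def power_suppression(sum_mnu_eV, omega_m=0.31, h=0.67):
    """Fractional suppression of small-scale power, |Delta P / P| ~ 8 f_nu."""
    f_nu = omega_nu_h2(sum_mnu_eV) / (omega_m * h**2)  # neutrino mass fraction
    return 8 * f_nu

for mnu in (0.06, 0.3, 1.0):  # illustrative mass sums in eV
    print(f"sum m_nu = {mnu} eV -> ~{100 * power_suppression(mnu):.0f}% suppression")
```

Even a few tenths of an eV suppresses clustering at the tens-of-percent level, which is why cluster counts and lensing are such sensitive (upper-limit) probes of the neutrino mass.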
Full randomness from arbitrarily deterministic events (2013)
Acín Dal Maschio, Antonio (ICFO)
Do random events exist in nature? This question has attracted, and keeps attracting, the interest of many different communities, ranging from philosophers to physicists and mathematicians. Classical physics is deterministic and does not contain any form of randomness. Quantum physics, however, does contain some form of randomness, as it is only able to make predictions in probabilistic terms. Yet the fact that a theory can only make probabilistic predictions does not necessarily imply that nature is random; it may simply reflect a limitation of the predictive power of the theory.

In 1964, Bell proved a theorem implying that quantum theory cannot be completed, suggesting the existence of an intrinsic form of randomness in the quantum world. Unfortunately, Bell's theorem assumes in its derivation the existence of an initial source of perfect randomness, which introduces circularity in the argument: random processes are shown to exist by assuming an initial random source!

The necessity of some form of randomness to run a Bell test implies that the strongest proof of randomness one can hope for using quantum physics is the following: does any amount of randomness, however small, suffice to run a Bell test that certifies perfect randomness? In other words, can randomness be amplified in the quantum regime?

In our work published in Nature Communications, we provide a solution to this question and indicate that randomness is indeed unavoidable in our description of nature: given an arbitrarily small amount of initial randomness, we show how non-local quantum correlations certify the existence of fully random processes in nature.

Beyond the clear implications this achievement has from a fundamental perspective, the results obtained are also relevant for quantum information processing. In fact, the study provides the first quantum protocol for full randomness amplification.
In randomness amplification, the goal is to extract perfect random bits from a source of arbitrarily imperfect randomness. In a seminal work, Santha and Vazirani proved that randomness amplification is impossible when relying only on classical systems. In our study, we prove that full randomness amplification becomes possible when using quantum resources.
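The classical impossibility can be made concrete with a toy calculation. In a Santha-Vazirani source, each bit may be biased by up to ε in a way that depends adversarially on all previous bits. The exact backward recursion below (a toy illustration of the classical obstruction, not the quantum protocol of the study) shows that an adversary can keep the XOR of any number of such bits biased by exactly ε, so XOR-ing more bits amplifies nothing:

```python
def adversary_bias(f, n, eps):
    """Maximum bias |P(f=1) - 1/2| an eps-SV adversary can force on the
    n-bit function f: at each step the adversary tilts the next bit
    (within |p - 1/2| <= eps) toward the branch that maximizes P(f=1).
    Computed exactly by backward recursion over bit prefixes."""
    def value(prefix):
        if len(prefix) == n:
            return f(prefix)
        v0 = value(prefix + (0,))
        v1 = value(prefix + (1,))
        return (0.5 - eps) * min(v0, v1) + (0.5 + eps) * max(v0, v1)
    return abs(value(()) - 0.5)

xor = lambda bits: sum(bits) % 2
for n in (1, 3, 5):
    print(n, adversary_bias(xor, n, 0.1))  # bias stays ~0.1 for every n
```

Santha and Vazirani's theorem says this holds for every deterministic post-processing function, not just XOR; escaping the bound requires the non-local quantum correlations used in the study.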
Self-Assembled 0D, 1D and 2D Quantum Structures in-a-Nanowire: direct correlation between physical properties and structure at atomic scale (2013)
Arbiol, Jordi (CSIC - ICMAB)
Inherent to the nanowire morphology is the exciting possibility of fabricating materials organized at the nanoscale in three dimensions. Composition and structure can be varied along and across the nanowire, as well as within coaxial shells. This opens up a manifold of possibilities in nanoscale materials science and engineering that is only accessible with a nanowire as a starting structure. As the variation in composition and structure is accompanied by a change in the band structure, it is possible to confine carriers within the nanowire. Interestingly, this results in the formation of locally two-, one- and zero-dimensional structures from the electronic point of view, all within the nanowire. This novel palette of nanostructures paves the way towards novel applications in many engineering domains such as lasers, high-mobility transistors, quantum information and energy harvesting. The quantum structures obtained and analyzed were grown by molecular beam epitaxy and correspond to different confinement approaches: quantum wells (2D), quantum wires (1D) and quantum dots (0D). The structure and morphology of these quantum structures integrated in single nanowires were determined at the atomic scale by means of aberration-corrected STEM; 3D atomic models were obtained and the final enhanced optical properties cross-correlated.