HASTE: Hierarchical Analysis of Spatial and Temporal Data

The HASTE project, an SSF-funded project on computational science and big data, takes a holistic approach to new, intelligent ways of processing and managing very large volumes of microscopy images, in order to leverage the imminent explosion of image data from modern experimental setups in the biosciences. One central idea is to represent datasets as intelligently formed and maintained information hierarchies, and to prioritize data acquisition and analysis for certain regions or sections of the data based on automatically computed metrics of usefulness and interestingness.
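As a toy illustration of the prioritization idea (a minimal sketch, not the HASTE implementation), the example below scores each incoming image with a cheap, assumed interestingness metric and routes it to a tier of a storage/analysis hierarchy; both the score and the thresholds are illustrative assumptions.

```python
# Toy sketch of interestingness-based prioritization (illustrative assumptions
# throughout; this is not the HASTE toolchain).
import numpy as np

def interestingness(image):
    """Cheap stand-in score: normalized intensity variance as a proxy for content."""
    img = image.astype(float)
    return img.var() / (img.mean() ** 2 + 1e-12)

def assign_tier(score, thresholds=(0.5, 0.1)):
    """Route high-scoring images to immediate analysis, low-scoring ones to colder tiers."""
    if score >= thresholds[0]:
        return "tier 1: analyze immediately"
    if score >= thresholds[1]:
        return "tier 2: keep on fast storage"
    return "tier 3: archive"

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Mock image stream with increasing photon counts (decreasing relative variance).
    stream = [rng.poisson(lam, size=(64, 64)) for lam in (1, 5, 50)]
    for i, img in enumerate(stream):
        score = interestingness(img)
        print(f"image {i}: score = {score:.3f} -> {assign_tier(score)}")
```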

The project is a collaboration between the Wählby lab (PI) and the Hellander lab (co-PI), both at the Department of Information Technology, Uppsala University; the Spjuth lab (co-PI) at the Department of Pharmaceutical Biosciences, Uppsala University; the Nilsson lab at the Department of Biochemistry and Biophysics, Stockholm University and SciLifeLab; Vironova AB; and AstraZeneca AB.

Read more on the project webpage.


Simulation of Stochastic Multicellular Systems

While we have learned a lot about gene regulation and control from single-cell models, there is a limit to what can be understood without considering cell-cell interactions. However, there is a fundamental computational gap between detailed models of single cells and models of multicellular systems comprising large numbers of interacting cells, such as bacterial colonies, tissues and tumors.

We seek to bridge this vast computational gap between quantitative, stochastic models of intracellular regulatory pathways and coarse-level models of multicellular systems. Together with collaborators, we also develop simulation methodology for modeling specific biological systems.
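As a toy illustration of what such a coupled model can look like (a minimal sketch, not the group's software), the example below simulates a 1D chain of cells in which each cell runs a stochastic birth-death process for a protein, with a production rate that is boosted by the copy numbers in neighboring cells; all rates and the coupling rule are assumptions made for illustration.

```python
# Toy sketch of a stochastic multicellular model (illustrative rates, not the
# group's software): a 1D chain of cells, each with a birth-death process for a
# protein X, where production in a cell is boosted by X in its two neighbors.
import numpy as np

def simulate(n_cells=20, t_end=50.0, k0=1.0, k1=0.05, d=0.1, seed=1):
    rng = np.random.default_rng(seed)
    x = np.zeros(n_cells, dtype=int)   # copy number of X in each cell
    t = 0.0
    while t < t_end:
        neighbors = np.roll(x, 1) + np.roll(x, -1)   # periodic boundary conditions
        birth = k0 + k1 * neighbors                  # per-cell production propensity
        death = d * x                                # per-cell degradation propensity
        propensities = np.concatenate([birth, death])
        total = propensities.sum()
        t += rng.exponential(1.0 / total)            # time to next reaction (Gillespie SSA)
        event = rng.choice(propensities.size, p=propensities / total)
        if event < n_cells:
            x[event] += 1                            # production
        else:
            x[event - n_cells] -= 1                  # degradation
    return x

if __name__ == "__main__":
    print("final copy numbers per cell:", simulate())
```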


Recent publications:

  • Marketa Kaucka, Evgeny Ivashkin, Daniel Gyllborg, Tomas Zikmund, Marketa Tesarova, Jozef Kaiser, Meng Xie, Julian Petersen, Vassilis Pachnis, Silvia K Nicolis, Tian Yu, Paul Sharpe, Ernest Arenas, Hjalmar Brismar, Hans Blom, Hans Clevers, Ueli Suter, Andrei S Chagin, Kaj Fried, Andreas Hellander and Igor Adameyko (2016) Analysis of neural crest-derived clones reveals novel aspects of facial development, Science Advances 2(8).


Smart systems for computational experiments

Integrating data, models and algorithms on the one hand with the specification, coordination and execution of large-scale, data-intensive computational experiments on the other is a fundamental problem in all scientific disciplines that rely on modeling and simulation. Today it is largely left to the modeler or engineer to manually tune models to fit data, to choose algorithms, to configure simulation workflows and to analyze simulation results. This is a big burden to place on, for example, a biologist who is mainly interested in how she can use modeling and simulation to learn new things about a biological system of interest.

Using machine learning and cloud computing, we are developing smart systems for scalable and efficient model exploration. An example of a workflow is shown in the image below, where a high-dimensional parameter sweep application is augmented with automated feature extraction and clustering, followed by training a classifier on user-defined labels (such as interesting or non-interesting realizations). With this model, the smart sweep application learns to explore the interesting regions of the parameter space more efficiently.
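The snippet below is a minimal, illustrative sketch of such a smart sweep loop, not the group's actual pipeline: the simulator, the summary-statistic features, the clustering and classification choices, and the labeling step are all stand-in assumptions.

```python
# Minimal sketch of a "smart" parameter sweep (illustrative only).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def simulate(theta):
    """Mock simulator: returns a time series whose behavior depends on theta."""
    t = np.linspace(0, 10, 200)
    return np.sin(theta[0] * t) * np.exp(-theta[1] * t)

def features(ts):
    """Cheap summary statistics used in place of learned features."""
    return np.array([ts.mean(), ts.std(), ts.max(), ts.min()])

rng = np.random.default_rng(0)

# 1. Initial sweep over a 2D parameter space.
thetas = rng.uniform([0.1, 0.01], [5.0, 1.0], size=(200, 2))
feats = np.array([features(simulate(th)) for th in thetas])

# 2. Cluster the realizations so a user only has to label a few representatives.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(feats)

# 3. Pretend the user labels clusters 0 and 2 as "interesting".
labels = np.isin(clusters, [0, 2]).astype(int)

# 4. Train a classifier mapping parameters -> interesting / not interesting.
clf = RandomForestClassifier(random_state=0).fit(thetas, labels)

# 5. Propose new points and keep only those predicted to be interesting.
candidates = rng.uniform([0.1, 0.01], [5.0, 1.0], size=(1000, 2))
next_batch = candidates[clf.predict(candidates) == 1]
print(f"{len(next_batch)} of 1000 candidate points prioritized for simulation")
```

In a real setting the mock simulator would be replaced by the actual model, the hand-written summary statistics by learned or domain-specific features, and the labeling step by interactive input from the user.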

Software and applied cloud computing

Open source computational science and engineering (CSE) software is an integral part of methodology-oriented computational research and a priority in the group. Due to the ongoing transformation of e-infrastructure towards clouds, methods and workflows that promote horizontal scalability and elasticity for cloud applications are needed, and this may in many cases require rethinking how we best make use of computational resources. Other important questions include reproducibility and the handling of large and complex data.


Multiscale simulations of chemical kinetics

Life spans a vast range of sizes, from small organisms consisting of a single cell to complex organisms built up of billions of cells. Even single-celled organisms are challenging to fully understand and study, since their function depends on a rich set of reaction networks. Important molecules inside a cell may exist in only a few copies, which makes them exceedingly difficult and costly to study.

The aim of our research is to develop algorithms and software that can assist in discoveries in basic science and medicine. We use mathematical models to describe how molecules move and interact inside cells, and then simulate these models to gain an understanding of how cells work. The multiscale nature of the problem is an interesting challenge. At the finest level we would consider single biomolecules and their exact molecular structure. There are models and methods for simulating systems at that level, but they are computationally expensive.

We couldn’t simulate the behavior of a large, complex system with such a method. Instead of considering the true structure of molecules, we could use a model that approximates them by spheres. At this level we can simulate medium-sized systems inside a cell on a time scale of seconds to minutes. An even more coarse-grained model doesn’t model individual molecules, but counts the number of molecules of different species in different parts of the domain. At this scale we can simulate bigger systems for hours.
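The sketch below illustrates this coarse-grained, compartment-based level under simplifying assumptions (a single species in a 1D domain, toy rates): molecules are not tracked individually, only their counts per voxel, and diffusion becomes jump events between neighboring voxels drawn with the stochastic simulation algorithm.

```python
# Minimal compartment-based (mesoscopic) sketch under toy assumptions: a single
# species diffusing and degrading in a 1D domain, where only the molecule count
# in each voxel is tracked and events are drawn with the Gillespie algorithm.
import numpy as np

def mesoscopic_1d(n_vox=20, n_mol=500, D=1.0, h=0.1, k_deg=0.05, t_end=1.0, seed=3):
    rng = np.random.default_rng(seed)
    counts = np.zeros(n_vox, dtype=int)
    counts[0] = n_mol                    # all molecules start in the leftmost voxel
    jump = D / h**2                      # per-molecule jump rate to each neighbor
    t = 0.0
    while t < t_end:
        left = jump * counts             # propensities for jumps to the left neighbor
        right = jump * counts            # propensities for jumps to the right neighbor
        left[0] = 0.0                    # no neighbor outside the domain
        right[-1] = 0.0
        degrade = k_deg * counts         # degradation propensity in each voxel
        propensities = np.concatenate([left, right, degrade])
        total = propensities.sum()
        if total == 0.0:                 # all molecules degraded
            break
        t += rng.exponential(1.0 / total)
        event = rng.choice(propensities.size, p=propensities / total)
        voxel = event % n_vox
        counts[voxel] -= 1
        if event < n_vox:
            counts[voxel - 1] += 1       # jump left
        elif event < 2 * n_vox:
            counts[voxel + 1] += 1       # jump right
        # otherwise: degradation, molecule simply removed
    return counts

if __name__ == "__main__":
    print("molecule counts per voxel:", mesoscopic_1d())
```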

We have developed methods that couple accurate fine-grained methods with less computationally expensive coarse-grained methods. In doing so, we obtain methods that are more accurate than the coarse-grained method, but still more efficient than the fine-grained method. Such methods are called multiscale methods. By adding scales to our simulations, both more accurate models that incorporate some of the many complex internal structures vital to the function of the cell, and more coarse-grained models, we attempt to move beyond the boundaries of what is currently possible to simulate with state-of-the-art methods.
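As a toy illustration of the coupling idea, and not one of our published multiscale algorithms, the sketch below splits a 1D diffusion problem into a fine-grained region, where individual particles undergo Brownian motion, and a coarse-grained region, where only counts per voxel are tracked; the hand-over rule at the interface and all parameters are simplifying assumptions.

```python
# Toy 1D illustration of coupling a fine-grained and a coarse-grained model:
# the left part of the domain tracks individual particles (Brownian motion),
# the right part tracks only molecule counts per voxel. The hand-over rule at
# the interface and all parameters are simplifying assumptions, not a
# production hybrid solver.
import numpy as np

rng = np.random.default_rng(7)

D, dt = 1.0, 1e-3                  # diffusion constant and time step
L_fine, L = 1.0, 3.0               # fine region: [0, L_fine); coarse region: [L_fine, L)
h = 0.2                            # coarse voxel width
n_vox = int(round((L - L_fine) / h))
p_jump = D * dt / h**2             # per-molecule jump probability per direction and step

particles = rng.uniform(0.0, L_fine, size=200)   # fine-grained molecules (positions)
voxels = np.zeros(n_vox, dtype=int)              # coarse-grained molecule counts

for _ in range(5000):
    # 1. Fine region: Brownian step with a reflecting wall at x = 0.
    particles = np.abs(particles + rng.normal(0.0, np.sqrt(2 * D * dt), particles.size))
    # Particles crossing the interface are absorbed into the first coarse voxel.
    crossed = particles >= L_fine
    voxels[0] += int(crossed.sum())
    particles = particles[~crossed]

    # 2. Coarse region: each molecule jumps to a neighbor voxel with probability p_jump.
    new_counts = voxels.copy()
    for i, n in enumerate(voxels):
        left, right, _ = rng.multinomial(n, [p_jump, p_jump, 1.0 - 2.0 * p_jump])
        new_counts[i] -= left + right
        if i > 0:
            new_counts[i - 1] += left
        else:
            # Jumps across the interface turn back into individual particles,
            # placed uniformly in the voxel-width strip next to the interface.
            particles = np.append(particles, L_fine - h * rng.uniform(0.0, 1.0, left))
        if i < n_vox - 1:
            new_counts[i + 1] += right
        else:
            new_counts[i] += right           # reflecting wall at x = L
    voxels = new_counts

print("particles in fine region:", particles.size, "| counts per coarse voxel:", voxels)
```

A production hybrid method would need a carefully derived interface condition to avoid bias at the boundary between the two descriptions; the point here is only to show how the two levels of description can coexist in one simulation.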
