Stefan Hellander joins the lab


We are delighted to have Stefan Hellander join the lab! Stefan obtained his Ph.D. in scientific computing from Uppsala University in 2013 with the thesis “Stochastic Simulation of Reaction-Diffusion Processes”, advised by Prof. Em. Per Lötstedt. He then worked as a postdoc in the lab of Prof. Linda Petzold at UCSB before returning to Uppsala in August 2017. His research interests include microscopic and mesoscopic modeling and simulation of reaction-diffusion systems, as well as multiscale modeling aimed at accurately integrating the different scales.

HASTE: Hierarchical Analysis of Spatial and Temporal Data

The HASTE project, an SSF-funded project on computational science and big data, takes a holistic approach to new, intelligent ways of processing and managing very large amounts of microscopy images, in order to leverage the imminent explosion of image data from modern experimental setups in the biosciences. One central idea is to represent datasets as intelligently formed and maintained information hierarchies, and to prioritize data acquisition and analysis for certain regions of the data based on automatically computed metrics of usefulness and interestingness.

The project is a collaboration between the Wählby lab (PI) and the Hellander lab (co-PI), both at the Department of Information Technology, Uppsala University; the Spjuth lab (co-PI) at the Department of Pharmaceutical Biosciences, Uppsala University; the Nilsson lab at the Department of Biochemistry and Biophysics, Stockholm University, and SciLifeLab; Vironova AB; and AstraZeneca AB.

Read more on the project webpage.


HASTE is granted 29 MSEK funding from SSF

Our project Hierarchical Analysis of Spatial and TEmporal Data (HASTE) has been granted 29 MSEK in funding from SSF. The project, with PI Carolina Wählby and co-PIs Andreas Hellander, Ola Spjuth and Mats Nilsson, will explore new ways to gain insight from massive amounts of spatial and temporal image data through hierarchical analysis models and smart cloud systems for managing data.

Simulation of Stochastic Multicellular Systems

While we have learned a lot about gene regulation and control from single-cell models, there is a limit to what can be understood without considering cell-cell interaction. However, there is a fundamental computational gap between detailed models of single cells and models of multicellular systems comprising large numbers of interacting cells, such as bacterial colonies, tissues and tumors.

We seek to bridge the vast computational gap between quantitative, stochastic models of intracellular regulatory pathways and coarse-level models of multicellular systems. We also engage in the development of simulation methodology for modeling specific biological systems together with collaborators.
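To give a flavor of what a stochastic intracellular model looks like at the finest scale, the sketch below simulates a minimal, hypothetical birth-death process with Gillespie's stochastic simulation algorithm (SSA). It is a toy illustration only, not a model or code from our publications; the model and parameter names are invented for the example.

```python
import numpy as np

def ssa_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=50.0, rng=None):
    """Gillespie SSA for a birth-death process:
    0 -> X with rate k_birth, X -> 0 with rate k_death * x."""
    if rng is None:
        rng = np.random.default_rng(0)
    t, x = 0.0, x0
    times, states = [t], [x]
    while True:
        # Propensities of the two reactions in the current state.
        a1, a2 = k_birth, k_death * x
        a0 = a1 + a2
        # Time to the next reaction is exponentially distributed.
        t += rng.exponential(1.0 / a0)
        if t >= t_end:
            break
        # Pick which reaction fires, proportionally to its propensity.
        if rng.random() * a0 < a1:
            x += 1
        else:
            x -= 1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

times, states = ssa_birth_death()
# The copy number fluctuates around the steady-state mean k_birth/k_death = 100.
print(states[-1])
```

Exact trajectories like this one are cheap for a single cell but become the computational bottleneck when thousands of interacting cells must each be simulated, which is precisely the scale gap discussed above.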


Recent publications:

  • Marketa Kaucka, Evgeny Ivashkin, Daniel Gyllborg, Tomas Zikmund, Marketa Tesarova, Jozef Kaiser, Meng Xie, Julian Petersen, Vassilis Pachnis, Silvia K Nicolis, Tian Yu, Paul Sharpe, Ernest Arenas, Hjalmar Brismar, Hans Blom, Hans Clevers, Ueli Suter, Andrei S Chagin, Kaj Fried, Andreas Hellander and Igor Adameyko (2016) Analysis of neural crest-derived clones reveals novel aspects of facial development, Science Advances 2(8).



Smart systems for computational experiments

The integration between data, modeling and algorithms on the one hand, and the specification, coordination and execution of large-scale, data-intensive computational experiments on the other, poses a fundamental problem in all scientific disciplines relying on modeling and simulation. Today it is largely left to the modeler or engineer to manually tune models to fit data, choose algorithms, configure simulation workflows and analyze simulation results. This is a big burden to place on, e.g., a biologist who is mainly interested in how she can use modeling and simulation to learn new things about a biological system of interest. By utilizing machine learning and cloud computing, we are developing smart systems for scalable and efficient model exploration. An example of a workflow is shown in the image below, where a high-dimensional parameter sweep application is augmented with automated feature extraction and clustering, followed by training of a classification model based on user-defined labels (such as interesting or non-interesting realizations). With this model, the smart sweep application learns to explore the interesting regions of the parameter space more efficiently.
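The stages of such a workflow (sweep, feature extraction, clustering, label-driven classification, prioritized sampling) can be sketched in a few lines. The following Python example is a toy illustration under invented names, with a cheap surrogate standing in for an expensive simulator; it is not the actual implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1. Parameter sweep: evaluate a (hypothetical) model over sampled parameters.
def simulate(theta):
    # Stand-in summary statistic for an expensive stochastic simulation.
    return np.sin(theta[0]) * np.exp(-theta[1])

params = rng.uniform(0, 3, size=(200, 2))
features = np.array([[simulate(p)] for p in params])  # automated feature extraction

# 2. Unsupervised clustering of the features (a minimal 2-means).
centroids = np.array([[features.min()], [features.max()]])
for _ in range(10):
    labels = np.argmin(np.abs(features - centroids.T), axis=1)
    centroids = np.array([[features[labels == k].mean()] for k in range(2)])

# 3. The user marks cluster 1 as "interesting"; train a nearest-centroid
#    classifier in parameter space from those labels.
interesting = labels == 1
centers = np.array([params[~interesting].mean(axis=0),
                    params[interesting].mean(axis=0)])

def predict_interesting(theta):
    d = np.linalg.norm(centers - theta, axis=1)
    return d[1] < d[0]

# 4. Prioritize new sweep points predicted to fall in the interesting region.
candidates = rng.uniform(0, 3, size=(1000, 2))
prioritized = [c for c in candidates if predict_interesting(c)]
print(f"{len(prioritized)} of {len(candidates)} candidates prioritized")
```

In a production setting the surrogate, clustering and classifier would each be replaced by real components (the simulator, a proper clustering method and a trained classifier), but the control flow, in which labels fed back by the user steer where compute is spent, is the point of the sketch.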

Software and applied cloud computing

Open-source computational science and engineering (CSE) software is an integral part of methodology-oriented computational research and a priority in the group. Due to the ongoing transformation of e-infrastructure to clouds, methods and workflows that promote horizontal scalability and elasticity for cloud applications are needed, and this may in many cases require rethinking how we best make use of computational resources. Other important questions include reproducibility and the handling of large and complex data.

Selected recent publications: