The practical problems that arise when going from simple models to big models include:

  • Most simulation algorithms do not scale well to high dimensions.
  • Simulation can become prohibitively expensive due to the multiscale nature of many systems.
  • Traditional engineering methodology such as sensitivity analysis and optimization fails due to high dimensionality, non-linearities and stochasticity…
  • … which leads to extreme computational cost for key tasks such as parameter inference, model selection and model exploration.
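As a toy illustration of the last point, consider a naive grid sweep over model parameters (a hypothetical sketch, not any particular inference method): with a fixed number of grid points per parameter, the number of simulations required grows exponentially with the number of parameters.

```python
# Toy illustration of the curse of dimensionality in parameter inference:
# a full grid sweep with k points per parameter and d parameters
# requires k**d model simulations.

def grid_sweep_cost(num_params: int, points_per_param: int = 10) -> int:
    """Number of model simulations needed for a full grid sweep."""
    return points_per_param ** num_params

for d in (2, 5, 10):
    print(f"{d} parameters -> {grid_sweep_cost(d):,} simulations")
# 2 parameters -> 100 simulations
# 5 parameters -> 100,000 simulations
# 10 parameters -> 10,000,000,000 simulations
```

Even at a modest resolution of 10 points per parameter, a 10-parameter model already demands ten billion simulations, which is why smarter strategies (surrogate models, Bayesian inference, adaptive sampling) become necessary for big models.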

Big models need big computation and big data, and this comes with its own challenges:

  • How do we manage data transport and data processing for big and fast scientific data when resources are limited?
  • How can parties collaborate to build and parametrize models without sharing/pooling data when it is hard to move, private or sensitive?
  • How can we design robust software that can leverage massive, heterogeneous distributed cloud, fog and edge resources?
  • How can we manage large-scale scientific e-infrastructure such as science clouds in scalable, cost-effective and energy-efficient ways?
