Integrating data, modeling, and algorithms on the one hand with the specification, coordination, and execution of large-scale, data-intensive computational experiments on the other is a fundamental problem in every scientific discipline that relies on modeling and simulation. Today it is largely left to the modeler or engineer to manually tune models to fit data, choose algorithms, configure simulation workflows, and analyze simulation results. This is a heavy burden to place on, say, a biologist whose main interest is in using modeling and simulation to learn new things about a biological system. By utilizing machine learning and cloud computing, we are developing smart systems for scalable and efficient model exploration. An example of such a workflow is shown in the image below: a high-dimensional parameter sweep application is augmented with automated feature extraction and clustering, followed by training a classification model on user-defined labels (such as interesting or non-interesting realizations). With this model, the smart sweep application learns to explore interesting regions of the parameter space more efficiently.
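To make the pipeline concrete, here is a minimal sketch of the sweep-label-learn-resample loop described above. All names (`simulate`, `extract_features`, the parameter ranges, and the synthesized labels) are illustrative assumptions, not our actual implementation; a toy nearest-centroid classifier stands in for whatever model the real system would train:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(theta):
    # Stand-in for an expensive simulation: a time series whose
    # behavior depends on the parameter point theta = (frequency, decay).
    t = np.linspace(0, 1, 50)
    return np.sin(2 * np.pi * theta[0] * t) * np.exp(-theta[1] * t)

def extract_features(ts):
    # Automated feature extraction: summary statistics per realization.
    return np.array([ts.mean(), ts.std(), ts.max() - ts.min()])

# 1. Naive sweep: sample parameter points and featurize each realization.
thetas = rng.uniform([0.5, 0.0], [5.0, 3.0], size=(200, 2))
features = np.array([extract_features(simulate(th)) for th in thetas])

# 2. A user labels a small subset as interesting (1) or not (0).
#    Here the labels are synthesized from the amplitude feature.
labeled_idx = rng.choice(len(thetas), size=20, replace=False)
labeled = features[labeled_idx]
labels = (labeled[:, 2] > np.median(labeled[:, 2])).astype(int)

# 3. Train a classifier on the labeled features (toy nearest-centroid).
centroids = np.array([labeled[labels == c].mean(axis=0) for c in (0, 1)])

def predict_interesting(f):
    # Assign a feature vector to the class of the closer centroid.
    return int(np.argmin(np.linalg.norm(centroids - f, axis=1)))

# 4. Smart sweep: propose new candidates and keep those the model
#    predicts to be interesting, focusing simulation budget there.
candidates = rng.uniform([0.5, 0.0], [5.0, 3.0], size=(500, 2))
keep = [th for th in candidates
        if predict_interesting(extract_features(simulate(th))) == 1]
print(f"retained {len(keep)} of {len(candidates)} candidates")
```

In a production setting the sweep would run as a distributed cloud workflow, the clustering step would suggest which realizations to show the user for labeling, and the classifier would be retrained as labels accumulate.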
