With ever-increasing innovation in measurement technology and the associated growth in the amount of data generated, the existing Non-Linear Mixed Effects (NLME) methodology is unlikely to cope with the challenges posed by population Pharmacokinetics/Pharmacodynamics (PK/PD) in the near future. For example, data from EEG, MRI, or PET scans amounts to multiple mega-, giga-, or potentially even terabytes, with a corresponding increase in the time required to execute an individual step in NLME model fitting. Similarly, more complex tier 1 models, such as stochastic ordinary differential equations (ODEs), partial differential equations (PDEs), and Markovian and survival models, will push the limits of current algorithms and implementations. To face these challenges, with data and models of ever-increasing complexity, improved performance at both the algorithmic and the software/hardware level is deemed crucial.
To tackle this, the lab is conducting research on the computational requirements of hierarchical model fitting, and on how the increasing complexity of disease models matches the available data.
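To make the hierarchical structure behind NLME population PK concrete, the following is a minimal sketch, not the lab's actual code: a one-compartment oral-dose model in which each subject's clearance is a population typical value perturbed by a log-normal random effect. All parameter names and values (`tv_cl`, `omega`, the dosing schedule) are illustrative assumptions.

```python
# Illustrative sketch of the two-level (population/individual) structure
# underlying NLME population PK models; parameter values are assumptions.
import math
import random

def conc(t, dose, ka, cl, v):
    """Concentration at time t for a one-compartment model with
    first-order absorption (ka) and elimination rate ke = cl / v."""
    ke = cl / v
    return dose * ka / (v * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

def simulate_population(n, dose=100.0, tv_cl=5.0, tv_v=40.0, ka=1.2,
                        omega=0.3, seed=0):
    """Draw per-subject clearances CL_i = TVCL * exp(eta_i), with
    eta_i ~ N(0, omega^2), and return each subject's profile."""
    rng = random.Random(seed)
    times = [0.5, 1, 2, 4, 8, 12, 24]
    profiles = []
    for _ in range(n):
        # Individual-level parameter: typical value times a random effect.
        cl_i = tv_cl * math.exp(rng.gauss(0.0, omega))
        profiles.append([conc(t, dose, ka, cl_i, tv_v) for t in times])
    return profiles

profiles = simulate_population(50)
```

Fitting such a model means estimating the population parameters (here `tv_cl`, `tv_v`, `omega`) from all subjects jointly, and it is this per-subject, per-iteration cost that grows with richer data and more complex tier 1 models.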
- Invited talk on elPrep at the (IB)² Seminar on March 20, 2015
- New results and software on HPC sparse computation published
- Videos from the “High Performance Computing (HPC) for Life Sciences” event now available
- Jörg Kurt Wegner (Johnson & Johnson) giving closing keynote at EuroQSAR 2014
- Tutorial on GASPI, the PGAS API