An ecosystem model is an abstract, usually mathematical, representation of an ecological (sub)system that is studied to better understand the real system. Ecological systems consist of an enormous number of biotic and abiotic factors that interact in often unknown ways or are so complex that they cannot be fully integrated into a computable model.
The process of modelling consists of 2 steps:
- Abstraction/Simplification:
Because of this complexity, ecosystem models are typically simplifications consisting of a limited number of components that are well understood and considered relevant to the problem. The number of system components that enter the model is reduced by grouping similar processes and entities into functional groups that are then treated as a single entity.
- Translation into the language of mathematics (not programming language!)
The process of abstraction/simplification often leads to a smaller number of state variables. The underlying physical, chemical or biological processes and the relationships between them are then described with mathematical equations.
Therefore, the task of modelling requires a priori knowledge of how the physical, chemical, and biological processes that determine the behavior of an ecosystem can be described with mathematical equations.
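To make these two steps concrete, the sketch below shows how a single ecological process, density-dependent population growth, can be reduced to one state variable, written as an equation, and turned into a computable model. It is a minimal illustration in Python with SciPy; the logistic model and all parameter values are arbitrary choices for demonstration only.

```python
# Minimal illustration of the two modelling steps: one state variable
# (population size N) and one equation, the classic logistic growth model
#   dN/dt = r * N * (1 - N / K)
# The parameter values (r, K, N0) are arbitrary and purely illustrative.
import numpy as np
from scipy.integrate import solve_ivp

r, K = 0.5, 100.0        # intrinsic growth rate and carrying capacity

def logistic(t, N):
    """Right-hand side of the logistic ODE."""
    return r * N * (1.0 - N / K)

sol = solve_ivp(logistic, t_span=(0.0, 30.0), y0=[5.0],
                t_eval=np.linspace(0.0, 30.0, 31))
print(sol.y[0][-1])      # the population approaches the carrying capacity K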
- Cellular Automata and Agent-Based Modeling by Tobias Müller (2014)
In this mini lecture, I will show how spatial systems can be modeled using cellular automata and agent-based approaches. The idea is to subdivide a system into its basic components and to define rules under which these components interact. The motivation is to understand system dynamics by reducing the complexity of the real world, allowing one to focus on the most basic process interactions which are assumed to govern the system behavior. As an example, a simple vegetation distribution is modeled. The most basic system components are 5x5m landscape cells. A few cells randomly receive seeds of yellow flowers, which grow and spread from cell to cell through the dispersal of plant seeds. Similarly, a blue flower is introduced and a competition over locations between the two types begins. Finally, an agent is introduced in the form of a moose, who likes to eat blue flowers. I will discuss snippets of the model code and show how easy it is to set up this kind of model in the NetLogo framework I used.
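To illustrate the cell-to-cell dispersal rule described above, here is a minimal sketch in Python/NumPy. It is not the NetLogo code from the lecture; the grid size, the number of initial seeds and the dispersal probability are arbitrary illustrative choices.

```python
# Minimal seed-dispersal cellular automaton: occupied cells may colonise
# their four neighbours with a fixed probability each time step.
import numpy as np

rng = np.random.default_rng(42)
EMPTY, YELLOW = 0, 1
grid = np.zeros((50, 50), dtype=int)            # 50 x 50 landscape cells

# a few cells randomly receive seeds of the yellow flower
seeds = rng.integers(0, 50, size=(5, 2))
grid[seeds[:, 0], seeds[:, 1]] = YELLOW

P_DISPERSAL = 0.2                               # chance to colonise a neighbour

def step(grid):
    """One time step: occupied cells may seed their four neighbours."""
    new = grid.copy()
    for r, c in np.argwhere(grid == YELLOW):
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < grid.shape[0] and 0 <= cc < grid.shape[1]:
                if grid[rr, cc] == EMPTY and rng.random() < P_DISPERSAL:
                    new[rr, cc] = YELLOW
    return new

for _ in range(20):
    grid = step(grid)
print("occupied cells:", int((grid == YELLOW).sum()))
```

The blue flower and the moose agent from the lecture would be added as further cell states and as movement/feeding rules on top of the same grid.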
▼
- CoMSES Net Computational Model Library
CoMSES Net maintains cyberinfrastructure to foster FAIR data principles for access to and (re)use of computational models. Model authors can publish their model code in the Computational Model Library with documentation, metadata, and data dependencies and support these FAIR data principles as well as best practices for software citation.
The Computational Model Library preserves computational and agent-based models in accordance with FAIR data principles and the FORCE11 Software Citation Working Group recommendations.
▼
- CoMSES Catalog
A bibliographic database of 7673 publications of agent-based and individual-based models in all application domains, from cancer modelling to ancient societies.
With this catalog you can explore the world of agent-based and individual-based models and find models of your interest.
▼
- Methods for Designing Cellular Automata with "Interesting" Behavior by Gutowitz, Howard and Langton, C. (1994)
Cellular automata are dynamical systems in which space, time, and the states of the system are discrete. Each cell in a regular lattice changes its state with time according to a rule which is local and deterministic. All cells on the lattice obey the same rule. Given a randomly-chosen cellular automaton rule, how do we expect it will behave? The following is an informal account of some attempts to answer this question quantitatively.
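As a minimal sketch of such a rule (not code from the cited paper), the following Python snippet runs an elementary one-dimensional cellular automaton; rule 30 is an arbitrary choice among the 256 possible elementary rules.

```python
# Minimal sketch of a local, deterministic rule applied to every cell of a
# lattice: an elementary one-dimensional cellular automaton (rule 30).
import numpy as np

RULE = 30
# lookup table: the rule number's bits give the new state for each of the
# eight possible (left, centre, right) neighbourhoods
table = np.array([(RULE >> i) & 1 for i in range(8)], dtype=np.uint8)

def step(row):
    """Apply the same local rule to every cell (periodic boundaries)."""
    left, right = np.roll(row, 1), np.roll(row, -1)
    idx = (left << 2) | (row << 1) | right   # neighbourhood as a 3-bit index
    return table[idx]

row = np.zeros(81, dtype=np.uint8)
row[40] = 1                                  # start from a single occupied cell
for _ in range(20):
    print("".join("#" if v else "." for v in row))
    row = step(row)
```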
▼
- Individual-Based Models by Craig Reynolds (2007)
A collection of models based on simulating the behaviour of individuals (plants and animals in ecosystems, vehicles in traffic, characters in animation) and the global consequences of their local interactions.
▼
- Complex Systems Modeling by Cellular Automata by Kroc, Jiří and Sloot, Peter (2009)
[DOI]
In recent years, the notion of complex systems proved to be a very useful concept to define, describe, and study various natural phenomena observed in a vast number of scientific disciplines. Examples of scientific disciplines that highly benefit from this concept range from physics, mathematics, and computer science through biology and medicine as well as economy, to social sciences and psychology. Various techniques were developed to describe natural phenomena observed in these complex systems. Among these are artificial life, evolutionary computation, swarm intelligence, neural networks, parallel computing, cellular automata, and many others. In this text, we focus our attention on one of them, i.e. 'cellular automata'.
▼
- ACE+ Suite Computational Fluid Dynamics
ACE+ Suite is an advanced CFD and high-fidelity Multiphysics simulation software package supporting the automotive, semiconductor, energy, microfluidics, and biotech industries, amongst others. With ACE+ Suite, engineers can virtually test the performance and behavior of their designs before they are manufactured, for the most extreme innovation requirements.
This tool offers the broadest range of physics disciplines including flow, heat transfer, stress/deformation, chemical kinetics, electrochemistry, biochemistry, electrostatics, electromagnetics, hypersonics, plasma and more – in any combination from a single environment.
ACE+ Suite also includes Design Space Exploration, Optimization and Reduced Order Model modules allowing you to fully understand and fine-tune designs.
▼
- Ansys Fluent | Fluid Simulation Software
FLUENT is a Computational Fluid Dynamics (CFD) code for modelling fluid flow, heat transfer, mass transfer and chemical reactions. The FLUENT package includes interfaces to other pre- and post-processing programs. The UDF option enables the incorporation of user-developed models into FLUENT through user-defined functions. 32- and 64-bit versions of the 2D and 3D single- and double-precision versions of Fluent have been installed. The primary CFD application on BlueBEAR is Ansys CFX; in particular, parallel licences have been purchased for CFX. Fluent has been installed to enable continued use of this application, but only serial (single-core) jobs can be run.
▼
- PETSc-FEM: A General Purpose, Parallel, Multi-Physics FEM Program
PETSc-FEM is a general purpose, parallel, multi-physics FEM (Finite Element Method) program for CFD (Computational Fluid Dynamics) applications based on PETSc. PETSc-FEM comprises both a library that allows the user to develop FEM (or FEM-like, i.e. non-structured mesh oriented) programs, and a suite of application programs. It is written in C++ with an OOP (Object-Oriented Programming) philosophy and with efficiency in mind. PETSc-FEM may run in parallel using the MPI standard on a variety of architectures, but it has currently been tested on Beowulf clusters only.
▼
- CFD codes list - free software
- Free and Low-Cost CFD Software
- swMath: Computational Fluid Dynamics
- GAMS/NIST: Guide to Available Mathematical Software
A cross-index and virtual repository of mathematical and statistical software components of use in computational science research.
▼
- GSL - GNU Scientific Library
The GNU Scientific Library (GSL) is a numerical library for C and C++ programmers. It is free software under the GNU General Public License.
The library provides a wide range of mathematical routines such as random number generators, special functions and least-squares fitting. There are over 1000 functions in total with an extensive test suite.
Unlike the licenses of proprietary numerical libraries the license of GSL does not restrict scientific cooperation. It allows you to share your programs freely with others.
Key Algorithmic Areas: Complex Numbers, Roots of Polynomials, Special Functions, Vectors and Matrices, Permutations, Sorting, BLAS Support, Linear Algebra, Eigensystems, Fast Fourier Transforms, Quadrature, Random Numbers, Quasi-Random Sequences, Random Distributions, Statistics, Histograms, N-Tuples, Monte Carlo Integration, Simulated Annealing, Differential Equations, Interpolation, Numerical Differentiation, Chebyshev Approximation, Series Acceleration, Discrete Hankel Transforms, Root-Finding, Minimization, Least-Squares Fitting, Physical Constants, IEEE Floating-Point, Discrete Wavelet Transforms, Basis Splines, Running Statistics, Sparse Matrices and Linear Algebra.
▼
- IMSL Numerical Libraries
The IMSL Numerical Libraries by Perforce offer battle-tested, high-ROI numerical libraries for advanced data analysis and forecasting applications.
Key Algorithmic Areas: Basic Statistics, Distribution Functions, Time Series and Forecasting, Maximum Likelihood Estimation, Nonparametric Statistics, Generalized Linear Models, Correlation & Covariances, Random Number Generation, Analysis of Variance, Hypothesis Testing, Goodness of Fit Tests, Design of Experiments, Probability Density and Cumulative, Multivariate Analysis, Optimization, Differential Equations, Linear and Non-linear Programming, Feynman-Kac Solver, Matrix Operations, Transforms, Linear Systems, Nonlinear Equations, Eigensystem Analysis, Special Functions, Interpolation and Approximation, Integration and Differentiation, Decision Trees, Support Vector Machines, Regression, Genetic Algorithm, Vector Auto-Regression, Naïve Bayes, Vector Error Correction Model, Bayesian Seasonal Time Series, Cluster Analysis, Logistic Regression, Kohonen Self Organizing Maps, Principal Components Analysis, Neural Networks, Factor Analysis, ARIMA, Discriminant Analysis.
Languages: C, Fortran, Java, Python
▼
- LAPACK — Linear Algebra PACKage
LAPACK is written in Fortran 90 and provides routines for solving systems of simultaneous linear equations, least-squares solutions of linear systems of equations, eigenvalue problems, and singular value problems. The associated matrix factorizations (LU, Cholesky, QR, SVD, Schur, generalized Schur) are also provided, as are related computations such as reordering of the Schur factorizations and estimating condition numbers. Dense and banded matrices are handled, but not general sparse matrices. In all areas, similar functionality is provided for real and complex matrices, in both single and double precision.
▼
- The NAG Library
The world's largest collection of robust, documented, tested and maintained numerical algorithms. NAG Library algorithms are inherently flexible – they can be called from a range of languages including C and C++, VBA, Python, Java, .NET and Fortran.
Key Algorithmic Areas: Mathematical Optimization, Statistics & Machine Learning, Algorithmic Differentiation, Special Functions, Linear Algebra, PDEs, Interpolation, Curve & Surface Fitting and Numerical Integration
▼
- Netlib
Netlib is a scientific computing program library maintained by AT&T Bell Laboratories, the University of Tennessee, and Oak Ridge National Laboratory. Netlib includes a large number of different programs and program libraries. Most of the code is written in Fortran.
Some well-known packages maintained in Netlib are:
- AMPL Solver Library (ASL),
- Basic Linear Algebra Subprograms (BLAS),
- EISPACK,
- LAPACK,
- LINPACK,
- MINPACK,
- QUADPACK
▼
- Numerical Recipes
The Numerical Recipes books cover a range of topics that include both classical numerical analysis (interpolation, integration, linear algebra, differential equations, and so on), signal processing (Fourier methods, filtering), statistical treatment of data, and a few topics in machine learning (hidden Markov model, support vector machines).
Available as book and online.
▼
- ODEPACK | Fortran ODE Solvers
ODEPACK is a collection of Fortran solvers for the initial value problem for ordinary differential equation systems. It consists of nine solvers, namely a basic solver called LSODE and eight variants of it. The collection is suitable for both stiff and nonstiff systems.
It is open-source software.
▼
- PETSc/TAO
Portable, Extensible Toolkit for Scientific Computation (PETSc), Toolkit for Advanced Optimization (TAO)
PETSc is a suite of data structures and routines for the scalable (parallel) solution of scientific applications modeled by partial differential equations. It supports MPI, and GPUs through CUDA or OpenCL, as well as hybrid MPI-GPU parallelism. PETSc (sometimes called PETSc/TAO) also contains the TAO optimization software library.
▼
Model validation is the process by which model outputs are (systematically) compared to independent real-world observations to judge the quantitative and qualitative correspondence with reality.
Mathematical models are developed to improve the understanding of a system and to make predictions regarding system behavior. Typically, these models depend on poorly defined or unmeasurable parameters/boundary conditions to which a value must be assigned. Fitting a model to the observed data, so-called inverse modelling, is often the only way to find reasonable values for these parameters.
Under certain conditions, inverse problems can be viewed as optimization problems in which model parameters are modified so that the predictions of the forward model match the measurements as closely as possible. Moreover, prior knowledge can be included in the objective function to obtain the maximum a posteriori estimate of the model parameters.
The objective function is usually the (weighted) sum of squared residuals. An optimization algorithm (minimization) then searches for the minimum of this objective function. Common algorithms are the Rosenbrock method, the Levenberg-Marquardt method, the Nelder-Mead simplex method, and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm.
A major difference between these methods is the computation/approximation of the Jacobian and Hessian matrices. Convergence of these methods is not guaranteed, and they converge only to a local optimum that depends on the starting parameters. Therefore, random-based (evolution) strategies are also often used.
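A minimal sketch of this classical workflow is given below, assuming a SciPy-based setup; the logistic model, the synthetic "observations" and the weights are illustrative assumptions, not part of any tool listed on this page. The weighted residuals define the objective function, and the Levenberg-Marquardt method performs the minimization.

```python
# Minimal inverse-modelling sketch: estimate the parameters (r, K) of a
# logistic growth model from noisy observations by minimising the weighted
# sum of squared residuals with the Levenberg-Marquardt method.
import numpy as np
from scipy.optimize import least_squares

def logistic(t, r, K, N0=5.0):
    """Analytical solution of dN/dt = r*N*(1 - N/K)."""
    return K / (1.0 + (K / N0 - 1.0) * np.exp(-r * t))

# synthetic "observations" with known true parameters and added noise
rng = np.random.default_rng(1)
t_obs = np.linspace(0.0, 30.0, 16)
N_obs = logistic(t_obs, r=0.5, K=100.0) + rng.normal(0.0, 3.0, t_obs.size)
sigma = np.full(t_obs.size, 3.0)          # measurement errors used as weights

def residuals(params):
    r, K = params
    return (logistic(t_obs, r, K) - N_obs) / sigma   # weighted residuals

fit = least_squares(residuals, x0=[0.1, 50.0], method="lm")
print("estimated r, K:", fit.x)           # should be close to 0.5 and 100
```

The Jacobian returned by the fit is reused further below to approximate the parameter variances.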
A new approach based on Theory-guided Neural Network (TgNN) is described in Wang et al. (2021).
But the task of inverse modelling is not completed with the parameter estimation; a further essential aspect is the assessment of the goodness of the estimated parameters. In this context, the focus is on:
- the identifiability of the parameters to be estimated,
- the variance and confidence intervals of the estimated parameters (a computational sketch follows this list),
- the analysis of the residuals (deterministic trends in the residuals indicate an inadequate model formulation), and
- cross-validation.
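For the variances and confidence intervals, a common approximate approach is the linearised covariance estimate cov ≈ s²(JᵀJ)⁻¹ built from the Jacobian at the optimum. The sketch below continues the fitting example above; it is an illustrative assumption about the workflow, not a procedure prescribed by any of the cited tools.

```python
# Approximate parameter variances and confidence intervals from the
# linearised covariance estimate cov ~= s^2 * (J^T J)^-1, using the Jacobian
# returned by the least-squares fit in the previous sketch (object `fit`).
import numpy as np
from scipy import stats

J = fit.jac                               # Jacobian of the residuals at the optimum
n_obs, n_par = J.shape
dof = n_obs - n_par
s2 = 2.0 * fit.cost / dof                 # residual variance (fit.cost = 0.5 * SSR)
cov = s2 * np.linalg.inv(J.T @ J)         # approximate covariance matrix
se = np.sqrt(np.diag(cov))                # standard errors of r and K
t_crit = stats.t.ppf(0.975, dof)          # 95% two-sided t quantile
for name, est, err in zip(("r", "K"), fit.x, se):
    print(f"{name} = {est:.3f} +/- {t_crit * err:.3f}")
```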
- Broyden–Fletcher–Goldfarb–Shanno algorithm - Wikipedia
In numerical optimization, the Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm is an iterative method for solving unconstrained nonlinear optimization problems. Like the related Davidon–Fletcher–Powell method, BFGS determines the descent direction by preconditioning the gradient with curvature information. It does so by gradually improving an approximation to the Hessian matrix of the loss function, obtained only from gradient evaluations (or approximate gradient evaluations) via a generalized secant method.
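As a minimal usage sketch (assuming SciPy's implementation; the Rosenbrock test function and the starting point are arbitrary illustrative choices), BFGS only needs the objective function and, optionally, its gradient:

```python
# Minimal BFGS usage sketch: minimise the Rosenbrock test function with
# SciPy's BFGS implementation (gradients are approximated numerically here).
import numpy as np
from scipy.optimize import minimize, rosen

result = minimize(rosen, x0=np.array([-1.2, 1.0]), method="BFGS")
print(result.x)        # should approach the minimum at (1, 1)
```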
▼
- Improved Nelder Mead’s Simplex Method and Applications [PDF] by Pham, Nam and Wilamowski, Bogdan M. (2011)
Nelder Mead’s simplex method is known as a fast and widely used algorithm in local minimum optimization. However, this algorithm by itself does not have enough capability to optimize large scale problems or train neural networks. This paper will present a solution to improve this deficiency of Nelder Mead’s simplex algorithm by incorporating a quasi-gradient method. This method approximates gradients of a function in the vicinity of a simplex by using numerical methods without calculating derivatives, and it is much simpler than analytical gradient methods from a mathematical perspective. With this solution, the improved algorithm can converge much faster with a higher success rate and still maintain the simplicity of the simplex method. Testing results of this improved algorithm on several benchmark optimization problems will be compared with Nelder Mead’s simplex method. Then this algorithm will be applied to synthesizing lossy ladder filters and training neural networks to control robot arm kinematics. These typical applications are used to show the ability of the improved algorithm to solve a variety of engineering problems.
▼
- The Levenberg-Marquardt Algorithm
The Levenberg-Marquardt (LM) algorithm is among the most widely used optimization algorithms for nonlinear least-squares problems. It outperforms simple gradient descent and other conjugate gradient methods in a wide variety of problems. The LM algorithm is a blend of vanilla gradient descent and Gauss-Newton iteration. Subsequently, another perspective on the algorithm is provided by considering it as a trust-region method.
▼
- Numerische Optimierung von Computer-Modellen mittels der Evolutionsstrategie by Schwefel, Hans-Paul (1977)
[DOI]
A comparative introduction to hill-climbing and random strategies.
▼
- Rosenbrock method for optimization of non-linear function.
The Rosenbrock method is a 0th-order search algorithm (i.e., it does not require any derivatives of the target function; only evaluations of the objective function are used). Yet it approximates a gradient search, thus combining advantages of 0th-order and 1st-order strategies. It was published by Rosenbrock in 1960.
This method is particularly well suited when the objective function is cheap to evaluate. In such cases, very complicated optimization algorithms are of little use: the time spent in the optimization calculations themselves outweighs the cost of simply performing a few more evaluations of the objective function, so a simpler method leads to a shorter overall calculation time.
▼
- On structural identifiability by Bellman, R. and Åström, K. J. (1970)
[DOI]
In this article a new concept is introduced, structural identifiability, which plays a central role in identification problems. The concept is useful when answering questions such as: To what extent is it possible to get insight into the internal structure of a system from input-output measurements? What experiments are necessary in order to determine the internal couplings uniquely? The definition of the concept of an identifiable structure is given. Criteria as well as certain identifiable structures are discussed. Particular emphasis is given to compartmental models.
▼
- Structural Identifiability of Dynamic Systems Biology Models by Villaverde, Alejandro F. and Barreiro, Antonio and Papachristodoulou, Antonis (2016)
[DOI]
A powerful way of gaining insight into biological systems is by creating a nonlinear differential equation model, which usually contains many unknown parameters. Such a model is called structurally identifiable if it is possible to determine the values of its parameters from measurements of the model outputs. Structural identifiability is a prerequisite for parameter estimation, and should be assessed before exploiting a model. However, this analysis is seldom performed due to the high computational cost involved in the necessary symbolic calculations, which quickly becomes prohibitive as the problem size increases. In this paper we show how to analyse the structural identifiability of a very general class of nonlinear models by extending methods originally developed for studying observability. We present results about models whose identifiability had not been previously determined, report unidentifiabilities that had not been found before, and show how to modify those unidentifiable models to make them identifiable. This method helps prevent problems caused by lack of identifiability analysis, which can compromise the success of tasks such as experiment design, parameter estimation, and model-based optimization. The procedure is called STRIKE-GOLDD (STRuctural Identifiability taKen as Extended-Generalized Observability with Lie Derivatives and Decomposition), and it is implemented in a MATLAB toolbox which is available as open source software. The broad applicability of this approach facilitates the analysis of the increasingly complex models used in systems biology and other areas.
▼
- UCODE, a computer code for universal inverse modeling by Eileen P. Poeter and Mary C. Hill (1999)
[DOI]
This article presents the US Geological Survey computer program UCODE, which was developed in collaboration with the US Army Corps of Engineers Waterways Experiment Station and the International Ground Water Modeling Center of the Colorado School of Mines. UCODE performs inverse modeling, posed as a parameter-estimation problem, using nonlinear regression. Any application model or set of models can be used; the only requirement is that they have numerical (ASCII or text only) input and output files and that the numbers in these files have sufficient significant digits. Application models can include preprocessors and postprocessors as well as models related to the processes of interest (physical, chemical and so on), making UCODE extremely powerful for model calibration. Estimated parameters can be defined flexibly with user-specified functions. Observations to be matched in the regression can be any quantity for which a simulated equivalent value can be produced; simulated equivalent values are calculated using values that appear in the application model output files and can be manipulated with additive and multiplicative functions, if necessary. Prior, or direct, information on estimated parameters also can be included in the regression. The nonlinear regression problem is solved by minimizing a weighted least-squares objective function with respect to the parameter values using a modified Gauss–Newton method. Sensitivities needed for the method are calculated approximately by forward or central differences, and problems and solutions related to this approximation are discussed. Statistics are calculated and printed for use in (1) diagnosing inadequate data or identifying parameters that probably cannot be estimated with the available data, (2) evaluating estimated parameter values, (3) evaluating the model representation of the actual processes and (4) quantifying the uncertainty of model simulated values. UCODE is intended for use on any computer operating system: it consists of algorithms programmed in Perl, a freeware language designed for text manipulation, and Fortran 90, which efficiently performs numerical calculations.
▼
- Parameter Identifiability of Fundamental Pharmacodynamic Models by Janzén, David L. I. and Bergenholm, Linnéa and Jirstrand, Mats and Parkinson, Joanna and Yates, James and Evans, Neil D. and Chappell, Michael J. (2016)
[DOI]
Issues of parameter identifiability of routinely used pharmacodynamics models are considered in this paper. The structural identifiability of 16 commonly applied pharmacodynamic model structures was analysed analytically, using the input-output approach. Both fixed-effects versions (non-population, no between-subject variability) and mixed-effects versions (population, including between-subject variability) of each model structure were analysed. All models were found to be structurally globally identifiable under conditions of fixing either one of two particular parameters. Furthermore, an example was constructed to illustrate the importance of sufficient data quality and show that structural identifiability is a prerequisite, but not a guarantee, for successful parameter estimation and practical parameter identifiability. This analysis was performed by generating artificial data of varying quality to a structurally identifiable model with known true parameter values, followed by re-estimation of the parameter values. In addition, to show the benefit of including structural identifiability as part of model development, a case study was performed applying an unidentifiable model to real experimental data. This case study shows how performing such an analysis prior to parameter estimation can improve the parameter estimation process and model performance. Finally, an unidentifiable model was fitted to simulated data using multiple initial parameter values, resulting in highly different estimated uncertainties. This example shows that although the standard errors of the parameter estimates often indicate a structural identifiability issue, reasonably "good" standard errors may sometimes mask unidentifiability issues.
▼
- Solving inverse problems in building physics: An overview of guidelines for a careful and optimal use of data by Rouchier, Simon (2018)
[PDF] [DOI]
The purpose of the present article is twofold. First, it is a tutorial on the formalism of inverse problems in building physics and the most common ways to solve them. Then, it provides an overview of tools and methods that can be used to assess the reliability of inverse problem results, prevent erroneous interpretation of data, and optimise information gained by experiments. It provides an introduction, along with useful references, to the topics of estimation error assessment, regularisation, identifiability analysis, residual analysis, model selection and optimal experiment design. These concepts are presented in the context of building simulation and energy performance assessment: a simple RC model is used as a running example to illustrate each chapter.
▼
- Inverse Modelling, Sensitivity and Monte Carlo Analysis in R Using Package FME by Soetaert, Karline and Petzoldt, Thomas (2010)
[PDF]
Mathematical simulation models are commonly applied to analyze experimental or environmental data and eventually to acquire predictive capabilities. Typically these models depend on poorly defined, unmeasurable parameters that need to be given a value. Fitting a model to data, so-called inverse modelling, is often the sole way of finding reasonable values for these parameters. There are many challenges involved in inverse model applications, e.g., the existence of non-identifiable parameters, the estimation of parameter uncertainties and the quantification of the implications of these uncertainties on model predictions. The R package FME is a modeling package designed to confront a mathematical model with data. It includes algorithms for sensitivity and Monte Carlo analysis, parameter identifiability, model fitting and provides a Markov-chain based method to estimate parameter confidence intervals. Although its main focus is on mathematical systems that consist of differential equations, FME can deal with other types of models. In this paper, FME is applied to a model describing the dynamics of the HIV virus.
▼
- Deep-Learning-Based Inverse Modeling Approaches: A Subsurface Flow Example by Wang, Nanzhe and Chang, Haibin and Zhang, Dongxiao (2021)
[DOI]
Deep-learning has achieved good performance and demonstrated great potential for solving forward and inverse problems. In this work, two categories of innovative deep-learning-based inverse modeling methods are proposed and compared. The first category is deep-learning surrogate-based inversion methods, in which the Theory-guided Neural Network (TgNN) is constructed as a deep-learning surrogate for problems with uncertain model parameters. By incorporating physical laws and other constraints, the TgNN surrogate can be constructed with limited simulation runs and accelerate the inversion process significantly. Three TgNN surrogate-based inversion methods are proposed, including the gradient method, the Iterative Ensemble Smoother method, and the training method. The second category is direct-deep-learning-inversion methods, in which TgNN constrained with geostatistical information, named TgNN-geo, is proposed as the deep-learning framework for direct inverse modeling. In TgNN-geo, two neural networks are introduced to approximate the random model parameters and the solution, respectively. In order to honor prior geostatistical information of the random model parameters, the neural network for approximating the random model parameters is first trained by using observed or generated realizations. Then, by minimizing the loss function of TgNN-geo, the estimation of model parameters and the approximation of the model solution can be simultaneously obtained. Since the prior geostatistical information can be incorporated, the direct-inversion method based on TgNN-geo works well, even in cases with sparse spatial measurements or imprecise prior statistics. Although the proposed deep-learning-based inverse modeling methods are general in nature, and thus applicable to a wide variety of problems, they are tested with several subsurface flow problems. It is found that satisfactory results are obtained with high efficiency. Moreover, both the advantages and disadvantages are further analyzed for the proposed two categories of deep-learning-based inversion methods.
▼
- ESA: The Ecological Society of America
The Ecological Society of America (ESA) is a nonpartisan, nonprofit organization of scientists founded in 1915 to:
- promote ecological science by improving communication among ecologists;
- raise the public’s level of awareness of the importance of ecological science;
- increase the resources available for the conduct of ecological science; and
- ensure the appropriate use of ecological science in environmental decision-making by enhancing communication between the ecological community and policy-makers.
Ecology is the scientific discipline that is concerned with the relationships between organisms and their past, present, and future environments. These relationships include physiological responses of individuals, structure and dynamics of populations, interactions among species, organization of biological communities, and processing of energy and matter in ecosystems.
▼
- EUROSIM: Federation of European Simulation Societies
EUROSIM - the Federation of European Simulation Societies, provides a European forum for regional and national simulation societies to promote the advancement of modelling and simulation in industry, research and development. Under the EUROSIM umbrella, EUROSIM Member Societies and co-operating societies and groups organize conferences, produce publications on modelling and simulation, work in standardizing or technical committees, etc. Simulation Notes Europe (SNE) is the official membership journal of EUROSIM.
▼
- GfÖ - The Ecological Society of Germany, Austria and Switzerland
The GfÖ is an independent, nonprofit scientific organisation founded in 1970. We aim to:
- promote basic and applied ecological science
- encourage collaborative work of all ecological disciplines
- improve communication among ecologists in German speaking countries and beyond
- facilitate education in ecology at universities and institutes of higher education
▼
- ILTER - International long term ecological research
The International Long Term Ecological Research Network, ILTER, is a network of networks, encompassing hundreds of research sites located in a wide array of ecosystems that can help understand environmental change across the globe. ILTER's focus is on long-term, site-based research and monitoring.
ILTER’s vision is a world in which science helps prevent and solve environmental and socio-ecological problems. ILTER contributes to solving international ecological and socio-economic problems through question and problem-driven research, with a unique ability to design collaborative, site-based projects, compare data from a global network of sites and detect global trends.
Specifically, the purpose of ILTER is to provide a globally distributed network and infrastructure of long-term research sites for use in the fields of ecosystem, biodiversity, critical zone and socio-ecological research, and to secure highest quality interoperable services in close interaction with related regional and global research infrastructures and networks.
▼
- IPCC: Intergovernmental Panel on Climate Change
The Intergovernmental Panel on Climate Change (IPCC) is the United Nations body for assessing the science related to climate change.
The IPCC provides regular assessments of the scientific basis of climate change, its impacts and future risks, and options for adaptation and mitigation.
▼
- ISEM: International Society for Ecological Modelling
The International Society for Ecological Modelling (ISEM) promotes the international exchange of ideas, scientific results, and general knowledge on the application of systems analysis and simulation in ecology and natural resource management. The Society was formed in Denmark in 1978 on the initiative of Professor Sven E. Jorgensen. Today ISEM has the following chapters: Africa, Australasia, Europe, North America and Japan. The society sponsors conferences, symposia, and workshops that promote the systems approach to ecological research and teaching, and to the management of natural resources. Its members frequently contribute research articles to the official scientific journal of the Society, Ecological Modelling.
▼
- SCS: The Society for Modeling & Simulation International
The Society for Modeling & Simulation International (SCS) was established in 1952 as a nonprofit, volunteer-driven corporation called Simulation Councils, Inc. Simulation Councils, Inc. became The Society for Computer Simulation which is where we derived the acronym SCS. Today, we still keep the familiar SCS acronym as a part of our identity.
SCS is the premier technical Society dedicated to advancing the use of modeling & simulation to solve real-world problems; devoted to the advancement of simulation and allied computer arts in all fields; and committed to facilitating communication among professionals in the field of simulation. To this end, SCS organizes meetings, sponsors and co-sponsors national and international conferences, and publishes the SIMULATION: Transactions of The Society for Modeling and Simulation International and the Journal of Defense Modeling and Simulation magazines. In addition to this, SCS also created The McLeod Modeling and Simulation Network (M&SNet) in 2003, which is a consortium of co-operating independent organizations active in professionalism, research, education, and knowledge dissemination in the modeling and simulation (M&S) domain. M&SNet aims to provide an organizational structure that will serve to integrate and enrich, within its organizations, modeling and simulation activities throughout the world. The M&SNet provides a framework within which organizations interested in M&S can interact, share expertise, and work on problems of common interest.
▼
- SMB: Society for Mathematical Biology
The Society for Mathematical Biology was founded in 1973 to promote the development and dissemination of research and education at the interface between the mathematical and biological sciences. It does so through its meetings, awards, and publications. The Society serves a diverse community of researchers and educators in academia, in industry, and government agencies throughout the world.
▼
- CEMC - the Canadian Environmental Modelling Centre
The Canadian Environmental Modelling Centre is a consortium of researchers at Trent University who use and/or develop models for application in the environment.
The Centre has been active for over two decades and encompasses research in a wide variety of environmental applications. The Centre fosters collaboration between modelers and experimentalists, facilitating interaction at the research planning level to help shape better experimental approaches and modelling outcomes. The Centre's breadth encompasses all types of model that relate to environmental issues.
▼
- Center for Environmental Science, University of Maryland
The University of Maryland Center for Environmental Science leads the way toward better management of Maryland’s natural resources and the protection and restoration of the Chesapeake Bay. From a network of laboratories located across the state, our scientists provide sound advice to help state and national leaders manage the environment and prepare future scientists to meet the global challenges of the 21st century.
▼
- ecotoxmodels
Ecotoxicology and conceptual, mathematical & simulation models. Toxicokinetic-toxicodynamic modelling. Discussions, papers, links, data, software. Information about the research team of Roman Ashauer at the University of York.
▼
- CEAM: EPA Center for Exposure Assessment Modeling
CEAM provides proven predictive exposure assessment techniques for aquatic, terrestrial, and multimedia pathways for organic chemicals and metals.
▼
- Hydrology and Remote Sensing Laboratory : USDA ARS
The mission of the Hydrology and Remote Sensing Laboratory is to conduct nationally oriented basic and applied research on the use of remote sensing in addressing water and soil resource concerns related to the production of food and fiber, climate change and the conservation of natural resources.
▼
- ICRAF: World Agroforestry
World Agroforestry (ICRAF) is a centre of science and development excellence that harnesses the benefits of trees for people and the environment. Leveraging the world’s largest repository of agroforestry science and information, we develop knowledge practices, from farmers’ fields to the global sphere, to ensure food security and environmental sustainability.
▼
- IGWMC - Integrated Groundwater Modeling Center
The IGWMC is an internationally oriented research, education, and information center for integrated groundwater modeling, hosted at the Colorado School of Mines. Our efforts focus on conducting research in practical, applied areas of groundwater hydrology and modeling, as well as engaging with our community through education and outreach initiatives. The IGWMC also organizes and supports short courses, workshops, and conferences to advance the appropriate use of quality-assured models in groundwater resources protection and management.
▼
- NREL: Natural Resource Ecology Laboratory at Colorado State University
NREL has pioneered research in ecosystem and watershed sciences that incorporates trans/inter-disciplinary and systems-level thinking to address basic and applied research questions, informing adaptive management frameworks and supporting policy development.
▼
- Rothamsted Research
Rothamsted Research is a world-leading, non-profit research centre that focuses on strategic agricultural science to the benefit of farmers and society worldwide.
▼
- Ecological Modelling
Ecological Modelling publishes new mathematical models and systems analysis for describing ecological processes, and novel applications of models for environmental management.
We welcome research on process-based models embedded in theory with explicit causative agents and innovative applications of existing models. And because applications can help refine models and propose new directions for research, the journal publishes both to help foster reproducibility and utility. Human activity and well-being are dependent on and integrated with the functioning of ecosystems and the services they provide. We aim to understand these basic ecosystem functions using mathematical and conceptual modelling, systems analysis, thermodynamics, computer simulations, and ecological theory, and look to a wide spectrum of applications ranging from basic ecology to human ecology to socio-ecological systems. The journal welcomes original research articles, review articles, viewpoint articles and short communications.
▼
- IEAM: Integrated Environmental Assessment and Management
Integrated Environmental Assessment and Management (IEAM) is published six times a year by the Society of Environmental Toxicology and Chemistry (SETAC). The journal is devoted to bridging the gap between scientific research and the application of science in decision making, policy and regulation, and environmental management. IEAM aims to be the premier scientific journal for presenting new information, promoting dialogue, and fostering new methods for the analysis of ecological, chemical, engineering, physical, and social science research applied to the advancement of environmental management strategies, policy and regulation, and problem solving.
▼
- Journal of Environmental Economics and Management
The Journal of Environmental Economics and Management publishes theoretical and empirical papers devoted to specific natural resource and environmental issues. To warrant publication in JEEM, papers should address new empirical findings that are of interest to a broader audience, theoretical analyses explaining new phenomena or puzzles, or the development of theoretical or empirical methods likely to be useful for further research.
▼
- Journal of Theoretical Biology
The Journal of Theoretical Biology is the leading forum for theoretical perspectives that give insight into biological processes. It covers a very wide range of topics and is of interest to biologists in many areas of research, including:
- Brain and Neuroscience
- Cancer Growth and Treatment
- Cell Biology
- Developmental Biology
- Ecology
- Evolution
- Immunology
- Infectious and Non-infectious Diseases
- Mathematical, Computational, Biophysical and Statistical Modeling
- Microbiology, Molecular Biology, and Biochemistry
- Networks and Complex Systems
- Physiology
- Pharmacodynamics
- Animal Behavior and Game Theory
▼
- Mathematical Biosciences
Mathematical Biosciences publishes work providing new concepts or new understanding of biological systems using mathematical models, or methodological articles likely to find application to multiple biological systems. Papers are expected to present a major research finding of broad significance for the biological sciences, or mathematical biology. Mathematical Biosciences welcomes original research articles, letters, reviews and perspectives.
▼
- Nature
First published in 1869, Nature is the world’s leading multidisciplinary science journal. Nature publishes the finest peer-reviewed research that drives ground-breaking discovery, and is read by thought-leaders and decision-makers around the world.
▼
- Science
Science is a leading outlet for scientific news, commentary, and cutting-edge research. Through its print and online incarnations, Science reaches an estimated worldwide readership of more than one million. Science’s authorship is global too, and its articles consistently rank among the world's most cited research.
Science serves as a forum for discussion of important issues related to the advancement of science by publishing material on which a consensus has been reached as well as including the presentation of minority or conflicting points of view. Accordingly, all articles published in Science—including editorials, news and comment, and book reviews—are signed and reflect the individual views of the authors and not official points of view adopted by AAAS or the institutions with which the authors are affiliated.
▼
- Simulation Modelling Practice and Theory
The journal Simulation Modelling Practice and Theory provides a forum for original, high-quality papers dealing with any aspect of systems simulation and modelling.
• theoretical aspects of modelling and simulation including formal modelling, model-checking, random number generators, sensitivity analysis, variance reduction techniques, experimental design, meta-modelling, methods and algorithms for validation and verification, selection and comparison procedures etc.;
• methodology and application of modelling and simulation in any area, including computer systems, networks, real-time and embedded systems, mobile and intelligent agents, manufacturing and transportation systems, management, engineering, biomedical engineering, economics, ecology and environment, education, transaction handling, etc.;
• simulation languages and environments including those, specific to distributed computing, grid computing, high performance computers or computer networks, etc.;
• distributed and real-time simulation, simulation interoperability;
• tools for high performance computing simulation, including dedicated architectures and parallel computing.
▼
- SNE: Simulation Notes Europe
Simulation Notes Europe (SNE) provides an international, high-quality forum for presentation of new ideas and approaches in simulation - from modelling to experiment analysis, from implementation to verification, from validation to identification, from numerics to visualisation - in context of the simulation process.
SNE seeks to serve scientists, researchers, developers and users of the simulation process across a variety of theoretical and applied fields in pursuit of novel ideas in simulation and to enable the exchange of experience and knowledge through descriptions of specific applications. SNE puts special emphasis on the overall view in simulation and on comparative investigations, such as benchmarks and comparisons in methodology and application. Additionally, SNE also welcomes contributions on education in / for / with simulation.
▼
- Springer: Environmental Sciences: Books and Journals
Explore Springer’s books and journals in Environmental Sciences, including reference works, textbooks, and open access publications.
▼
- Theoretical Ecology
Theoretical Ecology publishes innovative research in theoretical ecology, including ecophysiology, population ecology, behavioral ecology, evolutionary ecology, ecosystem ecology, community ecology, and ecosystem and landscape ecology. The editors emphasize work that bridges disciplinary boundaries, such as the intersection between quantitative social sciences and ecology, or physical influences on ecological processes.
The contents include regular articles, review papers, and rapid communications. Review papers will be published after a thorough review process; please contact the reviews editor before submission. Theoretical Ecology also offers a rapid communications option for short manuscripts of unusually broad interest to the ecological community.
▼
- Theoretical Population Biology
An interdisciplinary journal, Theoretical Population Biology presents articles on theoretical aspects of the biology of populations, particularly in the areas of demography, ecology, epidemiology, evolution, and genetics. Emphasis is on the development of mathematical theory and models that enhance the understanding of biological phenomena.
Articles highlight the motivation and significance of the work for advancing progress in biology, relying on a substantial mathematical effort to obtain biological insight. The journal also presents empirical results and computational and statistical methods directly impinging on theoretical problems in population biology.
▼