Robert Axtell, The Brookings Institution
In the past generation, macroeconomists have moved from phenomenological relationships between macrovariables,
estimated econometrically, to microfoundations of macrovariables that have at least some plausible behavioral
content. This research program has led to the ‘dynamic stochastic general equilibrium’ paradigm that obtains
in macroeconomics today. In this talk I shall report on early efforts that seek alternative microfoundations
for macroeconomics, using large-scale agent-based computational modeling. At least three independent projects
of this type exist today, most in the formative stages. Each seeks behaviorally-credible microscopic
specifications that are sufficient to produce macrovariables having empirically-relevant magnitudes and
fluctuations. I shall report in some detail on one of these, the MASs Macro project, which hopes to realize
models with 100 million agents in the next few years, with agents forming firms, making products, earning wages,
saving, investing and consuming. I shall also suggest definite ways in which emergent macrovariables,
like interest rates and prices, feed back to the micro level in such models. Such variables are often treated
as exogenous in microeconomics, suggesting there is a meaningful sense in which such models might ultimately
provide credible macrofoundations of microeconomics.
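By way of illustration only, the following minimal Python sketch (not the MASs Macro model itself, and with made-up
parameters) shows the basic shape of such a micro-to-macro loop: heterogeneous households earn wages, spend a
fraction of their wealth, and an aggregate price level emerges from their combined demand and feeds back into each
household's real consumption.

```python
import random

# Minimal agent-based macro sketch (illustrative only, not the project's model):
# households earn a wage, consume a fixed fraction of wealth, and save the rest;
# an aggregate price level emerges from total demand and feeds back into each
# household's real consumption decision.

class Household:
    def __init__(self):
        self.wealth = random.uniform(50, 150)
        self.propensity = random.uniform(0.6, 0.9)  # fraction of wealth spent

    def consume(self, price):
        nominal_spend = self.propensity * self.wealth
        self.wealth -= nominal_spend
        return nominal_spend / price                # real consumption

def simulate(n_agents=10_000, periods=50, wage=1.0, supply=10_000.0):
    agents = [Household() for _ in range(n_agents)]
    price = 1.0
    for t in range(periods):
        demand = sum(a.consume(price) for a in agents)  # aggregate real demand
        price *= demand / supply                        # emergent price adjustment
        for a in agents:
            a.wealth += wage                            # wage income
        print(f"t={t:3d}  price={price:10.3f}  real demand={demand:14.1f}")

if __name__ == "__main__":
    simulate()
```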
Chris Barrett, Virginia Tech
There is an increasing disconnect between traditional scientific disciplines, in part due to the increasing
influence of technology and society on the problems we confront. This effect is particularly evident in systems
such as “socio-technical,” “socio-economic,” “bio-social,” and “bio-technical” that inherently cross
traditional boundaries. New concepts and research tools are necessary to bridge the growing divide. Complexity
science, viewed broadly as a general topic area, promises to help fulfill this need.
Within social, economic and biological disciplines in particular, a frequent divergence of the interests
of researchers and practitioners is due to a commitment to symmetries and generalities on the one hand and
irregularities and particularities on the other. Concern for irregularity often underlies the complexity
concepts useful for bridging the gap. Indeed, important practical and theoretical problems can possess a great
deal of essential detail and unavoidable particulars, as well as a good deal of general knowledge and concepts that
have analogies in many disciplines. Theoreticians and modelers hoping to make their results relevant cannot
ignore this.
Our approach to these problems has focused on composed interaction-based systems to capture both
generalities and detail: their practical motivation and use, their formal characterization, and their computation.
Agent-oriented simulation is an example of an interaction-based computational technique useful for biological
and social systems. Scaling to very large systems raises important computational and conceptual issues such
as the necessity of the particular approach as well as its feasibility.
This talk will address some of the common issues that come up in interaction-based approaches to large
systems. The development of information-support environments for public health planning related to epidemics
of infectious diseases provides a useful setting to illustrate and examine some of these issues. Agent-based
simulation of epidemics for the purpose of mitigation planning will be discussed in this context.
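As a toy illustration of the interaction-based computations involved (not the planning environment itself, and with
invented parameters), a stochastic SIR epidemic on an explicit contact network can be written in a few dozen lines:

```python
import random

# Illustrative agent-based SIR epidemic on a random contact network
# (a toy stand-in for the large-scale planning tools discussed in the talk).

def run_epidemic(n=2000, mean_degree=8, p_transmit=0.05, recovery_days=7, seed_cases=5):
    random.seed(1)
    # Build a random contact network (Erdos-Renyi style).
    neighbors = [[] for _ in range(n)]
    for i in range(n):
        for _ in range(mean_degree // 2):
            j = random.randrange(n)
            if j != i:
                neighbors[i].append(j)
                neighbors[j].append(i)

    state = ["S"] * n                  # S, I, or R
    days_infected = [0] * n
    for i in random.sample(range(n), seed_cases):
        state[i] = "I"

    day, total_infected = 0, seed_cases
    while any(s == "I" for s in state):
        day += 1
        newly_infected = []
        for i in range(n):
            if state[i] == "I":
                for j in neighbors[i]:
                    if state[j] == "S" and random.random() < p_transmit:
                        newly_infected.append(j)
                days_infected[i] += 1
                if days_infected[i] >= recovery_days:
                    state[i] = "R"
        for j in newly_infected:
            if state[j] == "S":
                state[j] = "I"
                total_infected += 1
    return day, total_infected

if __name__ == "__main__":
    days, cases = run_epidemic()
    print(f"Epidemic ended after {days} days with {cases} total infections")
```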
R. Stephen Berry, The University of Chicago
This talk will cover two areas, Energy and Education, especially Science Literacy. First, Energy: Tested methods
of analyzing resource use expose situations that offer opportunities for innovations that could have very
large impact, e.g. on efficient use of energy. These have been largely "tree" approaches, of which an example
or two will be shown. On the other hand, a more powerful approach still awaits development. Wassily Leontief
introduced input-output analysis as a matrix approach to following resource use, but, under pressure of WWII
conditions, developed the method in monetary terms. It remains an open challenge to create an input-output
analysis based on physical quantities. Such an approach would open the way to simulate the effects of changing
technology, as well as representing the resource flows of current practice. On a more detailed level,
optimization by methods of finite-time thermodynamics and optimal control sometimes reveals new possibilities
for choices of control variables and hence for innovative design of improved processes. Second, education, and
especially science education, offers a great challenge for computer design and the internet. It should be
possible to design an interactive means by which the student-user would provide an answer to each new question
and receive, as response, an animation showing the consequences of the answer. The student could then change
the answer, see the new consequences and continue until the student had learned not only what the right answer
would be, but why. This approach will require very thoughtful formulation of the questions, to make such
interaction possible, and must be done in an engaging manner. At least one example will be given, but not as
a worked-out model.
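To indicate what a physical-units input-output analysis involves, here is a minimal sketch with an invented
two-sector technical-coefficient matrix A (physical inputs per physical unit of output) and final demand d; gross
output x then solves x = Ax + d:

```python
import numpy as np

# Illustrative Leontief input-output calculation in physical units.
# A[i, j] = physical units of sector i's output consumed per physical unit
# of sector j's output; d = final demand vector. All numbers are made up.
A = np.array([
    [0.10, 0.30],   # e.g. energy used per unit of energy and of manufactured output
    [0.05, 0.20],   # e.g. manufactured goods used per unit of each sector's output
])
d = np.array([100.0, 200.0])  # final demand in physical units

# Gross output needed to satisfy final demand: x = (I - A)^-1 d
x = np.linalg.solve(np.eye(2) - A, d)
print("gross output by sector:", x)

# Total direct plus indirect resource use per unit of final demand is given
# by the columns of the Leontief inverse.
print("Leontief inverse:\n", np.linalg.inv(np.eye(2) - A))
```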
J. Doyne Farmer, Santa Fe Institute
The standard neoclassical approach to doing economics assumes selfish, rational optimizing behavior. While
this approach has dominated economics for the last fifty years, it is increasingly coming under challenge and
new approaches are proliferating, to the point that according to Ken Arrow, "economics is in chaos". How does
one construct theories in economics that make sharp falsifiable predictions? I will give some examples based
on studies of data from financial markets. The methods used are in the spirit of statistical mechanics,
treating agents as slightly intelligent ping-pong balls. I will discuss two applications: (1) Predicting
the shape of supply and demand curves in financial markets, and (2) predicting the distribution of price
changes. The second application involves the construction of a very simple, empirically grounded agent-based
model, which suggests the existence of an underlying equation of state relating the flow of trading
orders to prices. By understanding what depends on intelligence and what doesn't, this approach helps make
it clear what depends on behavior vs. what depends on institutions, and results in accurate quantitative
predictions.
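For the first application, the instantaneous supply and demand curves can be read directly off the resting limit
orders; the sketch below (with invented orders, not data from the talk) shows the construction. The second
application then asks how random order flow shifts these curves and what distribution of price changes results.

```python
import numpy as np

# Illustrative reconstruction of instantaneous supply and demand curves from
# resting limit orders. The (price, size) pairs are made up for illustration.
buy_orders  = [(99.5, 300), (99.0, 500), (98.5, 700), (98.0, 1200)]
sell_orders = [(100.5, 250), (101.0, 600), (101.5, 800), (102.0, 1500)]

prices = np.arange(97.5, 103.0, 0.5)
# Demand at price p: total size of buy orders willing to pay at least p.
demand = [sum(size for quote, size in buy_orders  if quote >= p) for p in prices]
# Supply at price p: total size of sell orders willing to sell at or below p.
supply = [sum(size for quote, size in sell_orders if quote <= p) for p in prices]

for p, d, s in zip(prices, demand, supply):
    print(f"price {p:6.2f}   demand {d:6d}   supply {s:6d}")
```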
Alexander Outkin and Silvio Flaim, Los Alamos National Lab
An overview of LANL’s Financial System Infrastructure Model (FinSim) is presented. FinSim represents
the U.S. financial services sector as a complex decentralized system with multiple interacting autonomous
decision nodes, or agents. Those nodes represent different types of real-world agents, such as banks,
traders, markets, and brokers. Each agent has its own decision-making rules or capabilities, the ability to
retrieve and process information, the ability to execute its decisions, and the ability to interact with other
agents or systems. In our approach, financial system interactions are executed through an explicit
message exchange, intermediated by the telecommunications system, with its electric power dependencies, for
the purpose of investigating possible vulnerabilities. FinSim is implemented in Java and uses RePast
as an underlying agent-based framework. An overview of the simulation architecture is presented along
with a discussion of a hypothetical crisis scenario involving the FedWire payment system and Federal
Funds market and interactions between those entities during a crisis.
Robert Goldstein, Electric Power Research Institute
The presentation will examine scientific, technological, economic and societal implications of
increasing demands for electric power and water on regional and watershed levels. Special emphasis
will be placed on the arid and semi-arid western U.S. Questions will be raised regarding our ability
to create socio-economic models to address issues related to electric power/water sustainability.
Rajan Gupta, Los Alamos National Lab
Today, for the first time in history, people are conceiving the possibility that every child can be
provided the opportunity to develop to their full potential and not be limited by the circumstances of
their birth. This talk reviews five grand challenges that we need to overcome in order to make this desire
a reality: (i) population stabilization, (ii) global public health, (iii) environment and water resource
management, (iv) cheap clean energy, and (v) providing a nurturing and educational environment for children
for 22 years or more. These challenges, and their interactions, constitute highly complex non-linear systems.
This talk highlights the need for politics, markets, science, technology and the public to work together,
and for scientists to work towards a full system analysis.
Ira M. Longini Jr., Fred Hutchinson Cancer Research Center and University
of Washington
The world appears to be on the brink of a deadly influenza pandemic. Recent human deaths due to infection
by highly pathogenic avian influenza A (H5N1) virus have raised the specter of a devastating pandemic like
that of 1918-19, should this avian virus evolve to become readily transmissible among humans. It is
optimal to contain a nascent strain of influenza at the source. If this fails, then the best strategy is
to slow spread until a well-matched vaccine can be made and distributed. In this talk, I describe
large-scale stochastic simulation models to investigate the spread of a pandemic strain of influenza
virus both at the source and throughout the US. We model the impact that a variety of levels and
combinations of influenza antiviral agents, vaccines, and modified social mobility (including school
closure and travel restrictions) have on the timing and magnitude of this spread.
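A drastically simplified stochastic sketch conveys the kind of question such models address (this is not one of the
models in the talk; R0 and the intervention effects below are invented): how much does reducing transmissibility,
through some mix of antivirals, vaccination, and reduced mobility, change the final size of an outbreak?

```python
import numpy as np

# Toy stochastic epidemic (chain-binomial style) comparing final outbreak
# size with and without interventions that reduce transmissibility. This
# illustrates the question posed in the talk, not the talk's models; R0 and
# the intervention efficacies are made-up numbers.

def outbreak_size(population=1_000_000, r0=1.8, transmission_reduction=0.0,
                  initial_cases=10, infectious_days=4, seed=2):
    rng = np.random.default_rng(seed)
    beta = r0 * (1.0 - transmission_reduction) / infectious_days
    s, i, r = population - initial_cases, initial_cases, 0
    while i > 0:
        # Probability a susceptible is infected today, given i infectives.
        p_infect = 1.0 - (1.0 - beta / population) ** i
        new_cases = rng.binomial(s, p_infect)
        recoveries = rng.binomial(i, 1.0 / infectious_days)
        s, i, r = s - new_cases, i + new_cases - recoveries, r + recoveries
    return r

if __name__ == "__main__":
    for reduction in (0.0, 0.3, 0.5):
        print(f"transmissibility reduced by {reduction:.0%}: "
              f"{outbreak_size(transmission_reduction=reduction):,} total cases")
```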
Ben Luce, Los Alamos National Laboratory
Renewable energy technologies are coming of age, and seeing rapidly increasing deployment around the world
for a variety of reasons, presenting a number of new challenges for planners. This presentation will cover
the state of the art of wind power, concentrating (utility-scale) solar, distributed solar, and biomass power
technologies. Following this, the current and future deployment of these in the Southwest will be discussed,
with an emphasis on the emerging issues of integrating large percentages of these new resources into the
existing grid, including transmission, capacity factor, load following, energy storage, and other factors.
Ed MacKerrow, Los Alamos National Lab
Political Islamist opposition organizations adopt different strategies to effect social and political
changes. Different Islamist organizations pursue different means to achieve the shared end goal
of establishing political and social systems based strictly on Islamic tradition. The strategies followed
and tactics employed vary across Islamist organizations, within a given organization over time, and
across sub-organizations, or cells, of broader Islamic movements. Islamist organizations adapt their
strategies relative to the policies and actions of the regime and the strategies of competing opposition
movements. Meanwhile the regime and other competing movements adapt their own strategies relative to those
of the Islamists. The type and level of repression utilized by the regime helps determine whether violent
strategies will be pursued by the opposition Islamist organizations. An interesting question is whether
peaceful transitions to Islamist theocracies are possible in Muslim states governed by repressive
regimes, and, if so, what those transition dynamics would look like. In order to gain a better understanding
of the dynamics involved in these complex, adaptive social-political systems, we are developing a variety
of models and social simulations to examine different scenarios of how groups may adapt to varying levels
of repression, competition for political support, and grievances. In this talk we describe the
social-science models used to develop these simulations and discuss some preliminary insights gained from this
ongoing research.
David A. O'Brien, Los Alamos National Laboratory
During the 2005 hurricane season, many consequence predictions were available to key Federal agencies from
36 to 96 hours before each major hurricane's US mainland landfall. These key forecasts included the
location and intensity of the hurricane at landfall, areas of significant damage to engineered infrastructure
and lifeline utilities, time estimates to restore critical infrastructure services, and the conditions to be
found on the ground as emergency and relief crews enter the area. Both the Department of Energy through its
Visualization and Modeling Working Group and the Department of Homeland Security provided early forecasts of
potential damage to the regionally critical infrastructures. These products communicated critical information
that assisted in the decision-making process for emergency planning. This same methodology can provide
insight into infrastructure adaptation to less acute but no less extreme changes, such as climate change and
environmental damage, and into the resilience of both the engineered infrastructures (energy, transportation,
communications) and non-engineered systems (social, political, economic, demographic, geographic) when faced
with perceived threats.
Paul Ormerod, Volterra Consulting, London, UK
Both within complexity science and within economics, we can identify two separate approaches to modelling
and understanding empirical phenomena.
In complexity science, one approach is to build high-dimensional models which attempt to capture many detailed
features of reality. In contrast, the alternative approach is to build very low-dimensional models which make
considerable abstractions.
In economics, the standard socio-economic model (SSSM) postulates very considerable cognitive powers on the
part of its agents. They are able to gather all relevant information in any given situation, and to take the
optimal decision on the basis of it, given their tastes and preferences. Empirical work in economics over the
past 20 years or so has shown that in general these behavioural postulates lack empirical validity. Instead,
agents appear to have limited ability to gather information, and use simple rules of thumb to process the
information which they have in order to take decisions.
In the paper, I provide examples of the ability of low-dimensional models with low-cognition agents to
understand social and economic phenomena. Examples include economic recessions, the distribution of crime
rates across individuals, and why the probability of near-global failure or extinction increases as a
complex system becomes more connected.
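As one illustration of the last point, a toy cascade model (a sketch for illustration, not necessarily the model
used in the paper) shows how the chance of near-global failure can rise with average connectivity: nodes fail at
random, and each failure spreads to a neighbour with a fixed probability.

```python
import random

# Toy cascade model: nodes fail with a small exogenous probability, and each
# failed node drags down each neighbour independently with probability
# p_spread. The fraction of runs with near-global failure rises with
# connectivity. All parameters are illustrative.

def cascade_fraction(n=1000, mean_degree=2, p_seed=0.002, p_spread=0.4, seed=None):
    rng = random.Random(seed)
    neighbors = [[] for _ in range(n)]
    for _ in range(n * mean_degree // 2):          # random (Erdos-Renyi style) graph
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            neighbors[a].append(b)
            neighbors[b].append(a)
    failed = {i for i in range(n) if rng.random() < p_seed}
    frontier = list(failed)
    while frontier:
        node = frontier.pop()
        for nb in neighbors[node]:
            if nb not in failed and rng.random() < p_spread:
                failed.add(nb)
                frontier.append(nb)
    return len(failed) / n

if __name__ == "__main__":
    for k in (1, 2, 3, 4, 6, 8):
        runs = [cascade_fraction(mean_degree=k, seed=s) for s in range(200)]
        near_global = sum(f > 0.5 for f in runs) / len(runs)
        print(f"mean degree {k}: P(more than half of nodes fail) = {near_global:.2f}")
```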
Alexander Outkin, Los Alamos National Lab
An overview of the joint LANL-NMSU social dynamics modeling project is presented. The main premise of the
project is that social systems may exhibit a wide range of behaviors within similar circumstances: for example,
phase transitions from a normal state to orderly evacuation, or to a breakdown in law and order during a crisis.
Using an evacuation scenario as an example, the project aims to achieve the following goals: 1) create a model
of population decision-making during crises; 2) connect such a model to existing population mobility modeling
tools such as TRANSIMS; 3) identify possible levers and interventions that may affect aggregate population
behaviors. The project officially got underway in June 2006. We present our findings to date, including
an overview of approaches to crisis decision modeling and a proposed social dynamics model architecture.
Alan Perelson, Los Alamos National Laboratory
I will provide a brief overview of the use of agent-based models in immunology and show how they have given
insights into cell-mediated immune responses, influenza infection, and vaccination strategies for influenza.
D. V. Rao, Los Alamos National Laboratory
Infrastructure is the foundation of economic growth and prosperity. Infrastructure is also the means to deliver
goods and services to global populations imperiled by natural disasters and terrorist events. The presentation
includes a conceptual description of a proposed modeling effort by LANL and potential partners to capture global
infrastructure interdependencies and to measure the contribution of infrastructure development to stabilizing
failed and failing nation-states. Relationships to recent Presidential and Department of Defense Directives
are identified, and potential solutions to the methodological challenges are discussed.
Sid Redner, Boston University
When does consensus form and when is there perpetual clash in rational socially-interacting populations?
We investigate this question for generic opinion dynamics models. We first review basic results for the
classic voter model on regular lattices. For heterogeneous social networks, we show that consensus is
achieved more readily than on regular lattices due to the influence of individuals at network hubs. We
then discuss the competition between consensus and clash in a population of leftists, centrists, and
rightists that evolves by voter model dynamics, with the constraint that extremists of the opposite persuasion
do not interact. Finally, we discuss the rich behavior of the Axelrod model, where a transition between
consensus and cultural diversity is controlled by an anomalously long time scale and diversity is achieved
via a non-monotonic route.
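The classic voter model referred to above is simple to state and to simulate (the sketch below uses a ring lattice
and invented sizes; a heterogeneous network would simply supply different neighbour lists): at each update a
randomly chosen voter adopts the opinion of a randomly chosen neighbour, and the run ends at consensus.

```python
import random

# Minimal voter-model sketch: at each update a randomly chosen voter adopts
# the opinion of a randomly chosen neighbour; the run ends at consensus.
# The interaction graph here is a ring lattice; sizes are illustrative.

def time_to_consensus(n=100, seed=0):
    rng = random.Random(seed)
    opinions = [rng.choice([0, 1]) for _ in range(n)]
    ones = sum(opinions)
    neighbors = [((i - 1) % n, (i + 1) % n) for i in range(n)]  # ring lattice
    steps = 0
    while 0 < ones < n:
        i = rng.randrange(n)
        j = rng.choice(neighbors[i])
        ones += opinions[j] - opinions[i]   # running count of 1-opinions
        opinions[i] = opinions[j]           # voter i adopts neighbour j's opinion
        steps += 1
    return steps

if __name__ == "__main__":
    runs = [time_to_consensus(seed=s) for s in range(10)]
    print("mean updates to consensus on a ring of 100 voters:", sum(runs) / len(runs))
```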
Antonio Redondo, Los Alamos National Lab
Pathomics is a term that refers to the study of interactions between pathogens and their hosts. In this talk I will
discuss the need for theory, modeling and simulation in pathomics as a grand challenge. The presentation
will include an overview of comparative, multi-regime strategies to study different pathogens from the molecular
to the population levels.
Martin Short, National Renewable Energy Lab
This presentation will briefly describe the Wind Deployment Systems model (WinDS) and some of the studies
conducted with it. WinDS is a model of capacity expansion in the U.S. electric sector. It minimizes system
wide costs of meeting electric loads, reserve requirements, and emission constraints by building and operating
new generators and transmission in 26 two-year periods from 2000 to 2050. The primary outputs of WinDS are the
amount of capacity and generation of each type of prime mover—coal, gas combined cycle, gas
combustion turbine, nuclear, wind, etc.—in each region and each two-year period.
While WinDS includes all major generator types, it was designed primarily to address the market issues of
greatest significance to wind—transmission and resource variability. The WinDS model examines these issues
primarily by using a much higher level of geographic disaggregation than other models. WinDS uses 358 wind
supply regions in the continental United States which are aggregated into three levels of large regional
groupings—the power control areas (PCAs), North American Electric Reliability Council (NERC) regions, and
national interconnect regions.
WinDS has been used to evaluate the impact on wind deployment of production tax credits, carbon taxes, high
natural gas prices, state incentives, plug-in hybrid electric vehicles, etc. A few of these results will
be presented.
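The cost-minimizing logic of a capacity-expansion model can be illustrated with a single-region, single-period
linear program (vastly simpler than the actual WinDS formulation; all costs, capacity factors, and loads below are
invented):

```python
from scipy.optimize import linprog

# Toy one-region, one-period capacity-expansion LP in the spirit of a
# cost-minimizing expansion model. All costs, capacity factors, and loads
# are made up. Decision variables: x = [wind_MW, gas_MW, wind_MWh, gas_MWh]

annual_load_mwh = 5_000_000
peak_load_mw = 1_000
wind_cf, gas_cf = 0.35, 0.90          # maximum capacity factors
wind_capacity_credit = 0.15           # wind's contribution to peak reliability

# Annualized capital cost ($/MW-yr) and variable cost ($/MWh)
c = [150_000, 80_000, 0.0, 45.0]

# Inequality constraints A_ub @ x <= b_ub
A_ub = [
    [-wind_cf * 8760, 0, 1, 0],            # wind energy <= wind capability
    [0, -gas_cf * 8760, 0, 1],             # gas energy  <= gas capability
    [-wind_capacity_credit, -1.0, 0, 0],   # firm capacity >= peak load
]
b_ub = [0.0, 0.0, -peak_load_mw]

# Equality constraint: total generation meets annual load.
A_eq = [[0, 0, 1, 1]]
b_eq = [annual_load_mwh]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
wind_mw, gas_mw, wind_mwh, gas_mwh = res.x
print(f"build {wind_mw:,.0f} MW wind and {gas_mw:,.0f} MW gas; "
      f"wind share of energy = {wind_mwh / annual_load_mwh:.1%}")
```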
Eric Smith, Santa Fe Institute
The continuous double auction, which is the clearing mechanism of most major financial markets, provides a
rich enough institutional structure that agent intentions are often subordinate to what the clearing mechanism
requires or forbids. Such institutions can be fertile grounds for Zero Intelligence (ZI) modeling, where agent
intentions are entirely removed and a pure model of institutional dynamics and constraint is analyzed. ZI
modeling allows us to extract quantitative consequences of partial system specifications, and constitutes a
domain in social science where physics methods are likely to be useful. The failures of ZI models can also
direct the search for non-institutionally generated regularities of behavior. I will show some simple results
of the application of dimensional analysis to a ZI model of the continuous double auction, where both the
successes and failures are striking and informative.
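For concreteness, the dimensional bookkeeping behind such an analysis runs roughly as follows, using a
parameterization common in ZI double-auction models (conventions and numerical factors vary; the symbols here are
not necessarily those used in the talk):

```latex
% Zero-intelligence double-auction parameters (common convention):
%   \alpha : limit-order arrival rate density   [shares / (price \cdot time)]
%   \mu    : market-order arrival rate          [shares / time]
%   \delta : cancellation rate per share        [1 / time]
%   \sigma : characteristic order size          [shares]
\begin{align*}
  p_c &\sim \frac{\mu}{\alpha} && \text{(characteristic price scale, e.g. the spread)}\\
  N_c &\sim \frac{\mu}{\delta} && \text{(characteristic depth in shares)}\\
  t_c &\sim \frac{1}{\delta}   && \text{(characteristic time scale)}\\
  \varepsilon &= \frac{\sigma\,\delta}{\mu} && \text{(dimensionless granularity of order flow)}
\end{align*}
% Any observable with the dimensions of price, such as the average spread s,
% must then take the form  s = (\mu/\alpha)\, f(\varepsilon, \dots)  with f dimensionless.
```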
R. C. Vierbuchen, ExxonMobil Exploration Company
An evaluation of global production history and the global resource base suggests that a peak in global liquids
production, resulting solely from a resource-base limitation, is unlikely to occur in the next 25 years.
Furthermore, it appears that Hubbert’s (1956) method, made famous by his correct prediction in 1956 that U.S.
Lower-48 oil production would peak in the late 1960s or early 1970s, is not readily applicable to forecasting
global liquids production. The following observations support these conclusions:
- Estimates of the liquids resource base have increased over the last 50-100 years, and are likely to
continue to do so. Forecasts of an imminent peak in global production appear to underestimate major sources
of growth in the resource base, particularly improved recovery and resources made economic by new capabilities.
Hubbert’s method does not encompass the timing or the volume of future increases in the resource base.
- Although annual global production has exceeded annual discoveries since the early 1980s, annual global
reserve adds still exceed annual production because of reserve growth in existing fields.
- Advances in technology are increasing recovery, opening new producing areas, and lowering thresholds,
and thereby changing estimates of the resource base and production outlook.
- Non-OPEC supply has grown steadily for the last ten years, and continued growth for at least the next
five to ten years is highly likely, based on new development projects underway or planned. OPEC countries have
numerous opportunities to increase production.
- Nations with the largest liquids resources typically have production histories with long-term restraints
and interruptions in production that are not envisioned in Hubbert’s method.
- Sources of conventional liquids other than crude oil, such as condensate, NGLs, GTL, and refinery gains,
are growing, and are typically excluded from applications of Hubbert’s method.
- Production from “unconventional” sources, such as very heavy oil, bitumen, and shale oil, is growing, and
is often overlooked in global forecasts of peak production based on Hubbert’s method.
- The interactions among supply, demand, and price cause demand growth to slow as supply tightens, and bring
on new sources of supply.
- Current tightness in liquid supplies results from rapid demand growth and interruptions to supply, not
from a decrease in supply.
- Many previous predictions of a peak in global production based on Hubbert’s method, dating back to
Hubbert’s own prediction (made in 1969, for a peak in 2000), have been proven wrong.
Focus on the application of Hubbert’s method to predicting global peak production has distracted attention from
important questions regarding the global liquids resource base, such as these: (1) What improvements in
technology are likely to provide the largest improvements to supply and supply cost? (2) What factors limit
growth in global liquids supply, today and in the future? (3) What alternative methods can be applied to
better assess the global resource base and the multitude of factors that influence the rate of resource consumption?
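For reference, "Hubbert's method" is usually stated as a logistic fit to cumulative production (the form below is a
standard textbook statement, not taken from the talk): the production rate is a symmetric bell-shaped curve that
peaks when half of a fixed ultimately recoverable resource has been produced, which is precisely the
fixed-resource-base assumption the observations above call into question.

```latex
% Logistic (Hubbert) model: Q(t) = cumulative production, Q_\infty = ultimately
% recoverable resource, t_m = year of peak production, k = growth-rate constant.
\begin{align*}
  Q(t) &= \frac{Q_\infty}{1 + e^{-k\,(t - t_m)}}, \\[4pt]
  P(t) &= \frac{dQ}{dt} = k\,Q(t)\left(1 - \frac{Q(t)}{Q_\infty}\right),
         \qquad P_{\max} = P(t_m) = \frac{k\,Q_\infty}{4}.
\end{align*}
% The fitted curve peaks when Q = Q_\infty/2; upward revisions of Q_\infty shift
% the predicted peak outward.
```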
Kenneth Werely, Los Alamos National Lab
The optimal system corrective response to damaged electric power generation and transmission networks is
computed to minimize the shedding of customer load while maintaining conditions within acceptable operating
regimes. Optimal dispatching of real and reactive power sources (e.g., generator, phase-shifting transformer,
and intertie power injection for real power; and generation, variable transformers, static VAR compensators,
and variable capacitor and reactor banks for reactive power) is performed to eliminate (or reduce) system
violations such as line overloads and low voltages. Component disabling to prevent additional (self-) damage
is taken into account. Shedding the minimal amount of customer load to eliminate any remaining violations is
computed. Time-dependent system evolution, projected repair and restoration times, and estimated evolving
service and outage areas are also computed. Affected resources within outage areas can be totaled or
identified. During the recovery-from-damage phase, algorithms enable previously disabled transmission
components, including reconnecting "brown" or "black" electrical islands and recovering islanded load or
load previously shed to relieve violations. The above techniques can be applied to simulate cascading
failures, temporal demand variation effects, and rotating blackouts, as well as to identify critical assets
and potential system weaknesses, and to link electric power effects to other infrastructures. Applying optimal
system corrections to electric transmission problems can save lives, money, productivity, and quality of life.
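A drastically simplified two-bus sketch (invented limits and costs; real formulations include reactive power,
voltage constraints, and full network flow) shows the core load-shedding optimization as a linear program:

```python
from scipy.optimize import linprog

# Toy corrective-dispatch problem: two buses joined by one transmission line.
# Bus 1 has cheap generation, bus 2 has expensive generation and all the load.
# Minimize generation cost plus a heavy penalty on shed load, subject to
# generator limits and the line's thermal limit. All numbers are made up.

gen1_max, gen2_max = 800.0, 300.0     # MW
line_limit = 500.0                    # MW that can flow from bus 1 to bus 2
demand = 900.0                        # MW of load at bus 2

# Decision variables: x = [gen1, gen2, shed]
cost = [20.0, 60.0, 10_000.0]         # $/MWh; shedding is penalized heavily

A_eq = [[1.0, 1.0, 1.0]]              # gen1 + gen2 + shed = demand
b_eq = [demand]
bounds = [(0, min(gen1_max, line_limit)),   # bus-1 output limited by the line
          (0, gen2_max),
          (0, demand)]

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
gen1, gen2, shed = res.x
print(f"dispatch: {gen1:.0f} MW remote, {gen2:.0f} MW local, {shed:.0f} MW shed")
```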
Cathy Wilson, Los Alamos National Lab
Rapid population growth and severe drought are impacting water availability for all sectors (agriculture,
energy, municipal, industry…), particularly in arid regions. New power generation decision support tools,
incorporating recent advances in informatics and geographic information systems (GIS), are essential for
responsible water planning at the basin scale. The ZeroNet water-energy initiative developed a decision
support system (DSS) for the San Juan River Basin, with a focus on drought planning and economic analysis.
The ZeroNet DSS provides a computing environment (cyberinfrastructure) with three major components:
Watershed Tools, a Quick Scenario Tool, and a Knowledge Base. The Watershed Tools, based on the Watershed
Analysis Risk Management Framework (WARMF), provide capabilities 1) to model surface flows, both
natural and controlled, as well as water withdrawals, via an engineering module, and 2) to analyze and
visualize results via a stakeholder module. A new ZeroNet module for WARMF enables iterative modeling
and production of "what if" scenario libraries to examine consequences of changes in climate, land use,
and water allocation. The Quick Scenario Tool uses system dynamics modeling for rapid analysis and
visualization for a variety of uses, including drought planning, economic analysis, evaluation of
management alternatives, and risk assessment. The Knowledge Base serves simultaneously as the "faithful
scribe" to organize and archive data in easily accessible digital libraries, and as the "universal
translator" to share data from diverse sources and for diverse uses. All of the decision tools depend
upon GIS capabilities for data/model integration, map-based analysis, and advanced visualization. The
ZeroNet DSS offers stakeholders an effective means to address complex water problems.
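To give a flavor of the rapid "what if" analysis the Quick Scenario Tool supports, here is a minimal reservoir
mass-balance sketch in the system-dynamics spirit (an illustration only; the inflows, demands, and capacities are
invented and are not San Juan Basin data):

```python
# Minimal system-dynamics style reservoir balance for quick scenario
# comparison. Storage is updated month by month and shortages are recorded
# whenever demand cannot be met. All numbers are invented.

def run_scenario(inflows, demand_per_month, capacity=800.0, storage=300.0):
    shortage = 0.0
    for inflow in inflows:
        storage = min(storage + inflow, capacity)    # spill anything above capacity
        delivered = min(demand_per_month, storage)
        shortage += demand_per_month - delivered
        storage -= delivered
    return shortage

# Twelve months of inflow (thousand acre-feet) under normal and drought scenarios.
normal  = [120, 150, 200, 260, 220, 90, 60, 50, 55, 70, 90, 110]
drought = [x * 0.5 for x in normal]

for name, inflows in (("normal year", normal), ("drought year", drought)):
    for demand in (80.0, 110.0):
        short = run_scenario(inflows, demand)
        print(f"{name}, demand {demand:.0f}/month: total shortage {short:.0f}")
```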
Erla Zwingle, National Geographic Society
Cities are the fundamental building blocks of prosperity, as experts have stated, both for nations and for
families. The statistics bear this out. But as explosive urban growth worldwide over the past decade has
highlighted many problems, it has also shown many creative solutions. The world's mega-cities are not, as
they might appear, nothing more than overloaded freighters with no rudder and a large hole in the hull.
Many solutions are created, often spontaneously, by the poorest people in the city: the ones who have the
most to gain and nothing to lose. Field work for three major articles for National Geographic magazine has
taken me to a wide variety of places in which men and women are showing courage, ingenuity, and hope in
dealing with the problems of their lives. The themes of these three projects, Population (October 1998),
Globalization (August 1999), and Urbanization (November 2002), are all tightly intertwined. My presentation
will describe some of the most difficult situations I have seen and some of the people and programs which
are making them better.