Friday, March 23, 2012

Modeling the Future: Tools from Complex Systems

2012 and Beyond: Tools for Predicting the Next Sixty Years


Is the world coming to an end on Dec. 21st, 2012, or not?

Very likely, not.

We'll still wake up in the morning, in the same beds in which we went to sleep the night before. We'll still walk out to our cars, or get to our Metro stations, on time. And we'll likely stop for the same "cup of joe" on the way to work.

But will our future, in any way, be significantly different from what we're experiencing now?

My best-guess, and carefully considered, answer to this is "yes." But "yes" with a qualifier.

And that qualifier is: the difference(s) may be a bit difficult to discern.

A case in point: a certain type of solid undergoes a phase transition without ever melting. Specifically, some "solid electrolytes" (e.g., silver iodide) undergo a phase transition at a certain temperature. Below this temperature (420 K for AgI), the solid does not conduct electricity. Above it, the solid conducts electricity; the Ag+ ions have become mobile.

Before the phase transition, and after it, the AgI is still a physical "solid." It hasn't melted into a puddle on the floor. It hasn't gotten all waxy or goopy. But after the phase transition, its functionality changes a great deal.

The reason? We can model this substance as two intermeshed structures. The anion structure (the negative ions, or I-) is b.c.c. (body-centered cubic). The cations (the positive ions, or Ag+) are distributed among six possible "sites" for each anion. Clearly, there's some room to move. And when this substance gets hot enough, the silver ions do indeed move about.

Up until the 420 K phase transition, not much seems to be happening. It's just as when we heat water: right up to boiling, we put in more and more heat energy, and nothing seems to happen. (The "watched pot" isn't boiling.) Then, when we've finally put in enough heat energy, it boils - the phase transition occurs. And we go from something that has some degree of cohesion (water droplets stick together) to something with very little cohesion (steam).

In the silver iodide case, we still have apparent cohesion. There is still a structural lattice, formed of the relatively immobile large iodide ions. The iodide ions, almost twice as large as the silver ions, create a 3-D "mesh" through which the now-mobile silver ions can slide. And given sufficient heat, that's what they do. A current can now pass through this structurally-stable substance. (For the AgI heat capacity graph, see "Heat capacity, thermodynamic properties, and transitions of silver iodide," by R. Shaviv et al., M-2363, J. Chem. Thermodynamics, 1989, 21, 631-651.)

So why is this relevant to human history, both past and forthcoming?

Because - by way of analogy - our society can be compared to a solid electrolyte going through a phase transition.

We'll still have houses and beds, and cars, and our favorite morning "cup of joe." That's our stable anionic structure. But moving through it, rapidly, we have such things as: stock price fluctuations, news reports, companies forming, dying, and merging, and countless other info-items. The infosphere penetrates and moves through our physical sphere.

Practical application? Much of our tangible physical environment will stay the same. However, we're approaching a series of "phase transitions" in our infosphere. We've been putting in enough "energy" over the past decades: research dollars, manufacturing dollars, and lots and lots of human hours creating the data/info that moves through our infosphere. That adds up to a great deal of energy - and it is building us toward those phase transitions.

For details on these phase transitions, consult my previous blogpost Going Beyond Moore's Law, which references a good, scholarly article on upcoming phase changes in the IT space.

Modeling these phase changes in our infosphere is important, if we're to have a handle on our future. The phase transitions in very simple physical systems, such as AgI, are a starting point for giving us analogies. This means that they are not really precise models of what's going on, but rather, they function as stories - means by which we can describe what is happening to ourselves.

From the very simple "stories" of phase changes in physical systems, we move on to understand more complex - but still "modelable" - systems. Cellular automata, neural networks, and genetic algorithms all give us useful tools by modeling different aspects of complex biological systems. We can also study these systems themselves, following works such as Self-Organization in Biological Systems, by Camazine et al.
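To make one of these tools concrete, here is a minimal sketch of an elementary (one-dimensional, two-state) cellular automaton in Python. The rule number, grid size, and number of steps are arbitrary illustrative choices, not tied to any particular model discussed above:

```python
# Minimal 1-D elementary cellular automaton (Wolfram-style) - a sketch.
# Rule 30 and the grid size are arbitrary choices for illustration.

def step(cells, rule=30):
    """Apply one update of an elementary CA with periodic boundaries."""
    n = len(cells)
    new_cells = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # value 0..7
        new_cells.append((rule >> neighborhood) & 1)        # look up the rule bit
    return new_cells

# Start from a single "on" cell and watch global structure emerge.
cells = [0] * 41
cells[20] = 1
for _ in range(20):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Even this toy example carries the core lesson of the complex-systems toolkit: rich, hard-to-predict global patterns emerge from purely local rules.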




As we gain a solid modeling and "story-telling" toolset, we can apply these tools to the world around us. Getting a sense of our history - not just names and dates, but rather forces, patterns, and emergent systems - is the next step. Recently, some authors have been giving us such insightful studies. For example, read Fukuyama's works, e.g., Origins of Political Order. (And for a commentary and overview, see John Reilly's critique of Fukuyama's Origins of Political Order.)

Tuesday, February 7, 2012

Modeling Focus for 2012 and Beyond: Carrying Capacity and Convergent Events

Carrying Capacity, Peak Oil, Convergent Event Streams


Probably the most insightful work right now is being done by Paul Chefurka, as presented in his blogpost series. I found his work while laying a foundation for my modeling efforts, and his arguments seem rational and sound. One particularly important blogpost deals with Population - the Elephant in the Room.

Friday, December 9, 2011

Good Read on Modeling Social Emergent Phenomena - But Still Not There Yet!

Philip Ball - Critical Mass


The most important thing we can do right now - given the huge changes ahead of us, in society, in the world at large, and in technology - is to get some sort of "handle" on what's coming up. By that, I mean a good set of models.

And as a result, I'm on a search for good models. Those that I know, those that are new. Those that make sense, and those that don't. (The ones that don't make sense need to go into the "don't work" bin - but we need to know what we're relegating where.)

I'm starting to re-invigorate my modeling, and to connect with others about this. And along these lines, a dear colleague recommended one of his favorite books - Philip Ball's Critical Mass. I've had a good look at the Amazon "Look Inside" feature, which offers both the intro and first chapter, and the notes/references at the end.



Overall this book is great - I'm going to get a copy (from my public library, of course!) - but it just doesn't go far enough.

Don't get me wrong. I'm all in favor of books that lay the groundwork and set the stage. Critical Mass definitely serves this need. However, we'll need to go beyond what is offered and discussed here to get what we really need right now: a robust set of very basic but useful models, together with a clear sense of which models apply to which situations, what assumptions and constraints have to be made, how we interpret the model variables, and what the model parameters actually mean.

And oh yes. This is what differentiates this new set of models and modeling tools from the previous generation. These models need to deal with nonlinear and (fairly often) non-equilibrium systems.

That said, Critical Mass looks well worth the read, and I've already looked up several of the references, and either read them online or plan to get the books.

Good job, Philip! And thank you!

Thursday, December 8, 2011

Analytic Single-Point Solution for Cluster Variation Method Variables (at x1=x2=0.5)

Single-Point Analytic CVM Solution Involves Solving a Set of Nine Nonlinear, Coupled Equations


The Cluster Variation Method, first introduced by Kikuchi in 1951 ("A theory of cooperative phenomena," Phys. Rev. 81 (6), 988-1003), provides a means for computing the free energy of a system where the entropy term takes into account the distribution of particles into local configurations as well as the distribution into "on/off" binary states. As the equations are more complex, numerical solutions for the cluster variation variables are usually needed. (For a good review, see Yedidia et al., Constructing free energy approximations and generalized belief propagation algorithms.)
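For orientation, the general shape of the free energy being minimized - written here in the region-based notation of the Yedidia et al. review, not as the specific nine-equation system solved below - is:

```latex
% Region-based (Kikuchi-style) free energy, following Yedidia et al.
% R ranges over clusters (here: single units, pairs, triplets);
% p_R are the cluster probabilities (the cluster variables), and the
% c_R are counting numbers chosen so that overlapping clusters are
% not over-counted.
F = U - TS, \qquad
U = \sum_{R} c_R \sum_{x_R} p_R(x_R)\, E_R(x_R), \qquad
S = -k \sum_{R} c_R \sum_{x_R} p_R(x_R) \ln p_R(x_R)
```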

When allowed to stabilize, the system comes to equilibrium at free energy minima, where the free energy equation involves both an interaction energy between neighboring units and an entropy term that includes the cluster variables. This computation addresses a system composed of a single zigzag chain.

I have computed an analytic solution representing one of the cluster variables, z3, as a function of the reduced interaction energy term.

The equation details are presented in a separate Technical White Paper; I'll include a link to it as soon as I post it on my website, http://www.aliannamaren.com.


This pattern of CVM variables follows what we would expect.

The point on this graph where h=1 (at the value 10 on the x-axis) corresponds to h = exp(beta*epsilon) = 1; effectively, beta*epsilon approaches 0. This is the case where either the interaction energy (epsilon) is very small, or the temperature is very large. Either way, we would expect - at this point - the most "disordered" state. The cluster variables should all take on their nominal values: z1 = z3 = 0.125, and y2 = 0.25. This is precisely what we observe.
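As a quick sanity check on those nominal values - and only on those; this little sketch solves none of the CVM equations and ignores the zigzag geometry - one can count pattern frequencies in a long random chain with x1 = x2 = 0.5:

```python
# Sanity check of the h = 1 (no-interaction) limit - not a CVM solver.
# In a long random equiprobable chain, any specific ordered pair pattern
# occurs with frequency 0.25, and any specific triplet with 0.125.
import random

random.seed(1)
N = 200_000
chain = [random.randint(0, 1) for _ in range(N)]   # 0 = A, 1 = B

ab  = sum((chain[i], chain[i+1]) == (0, 1) for i in range(N - 1))
aba = sum(tuple(chain[i:i+3]) == (0, 1, 0) for i in range(N - 2))
aaa = sum(tuple(chain[i:i+3]) == (0, 0, 0) for i in range(N - 2))

print("y2 (A-B):  ", ab  / (N - 1))   # ~0.25
print("z3 (A-B-A):", aba / (N - 2))   # ~0.125
print("z1 (A-A-A):", aaa / (N - 2))   # ~0.125
```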

Consider the case of a positive interaction energy between unlike units (the A-B pairwise combination). A positive interaction energy (epsilon > 0) suggests that a preponderance of A-B pairs (y2) would destabilize the system. We would expect that as epsilon increases, y2 would be minimized, along with small values for those triplets that involve non-similar pair combinations; that is, the A-B-A triplet, or z3, approaches zero. We observe this on the RHS of the above graph: as h = exp(beta*epsilon) moves above 1 (beta*epsilon in the positive range, 0-3), y2 and z3 fall towards zero. In particular, z3 becomes very small. Correspondingly, this is also the situation in which z1 = z6 becomes large; we see z1 taking on values > 0.4 when h > 2.7.

This is the realm of creating a highly structured system in which large "domains" of like units mass together. These large domains (composed of overlapping A-A-A and B-B-B triplets) stagger against each other, with relatively few instances of "islands" (e.g., the A-B-A and B-A-B triplets).

Naturally, this approach - using a "reduced energy term" of beta*epsilon, where beta = 1/(kT) - does not tell us whether we are simply increasing the interaction energy or reducing the temperature; they amount to the same thing. Both give the same resulting value for h, and it is the effect of h that we are interested in when we map the CVM variables and (ultimately) the CVM phase space.
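A two-line check of that equivalence (purely illustrative; units chosen so that k = 1):

```python
# h = exp(beta * epsilon), with beta = 1/(kT) and k set to 1.
# Scaling epsilon and T together leaves h - and hence the mapped
# CVM variables - unchanged.
from math import exp

def h(epsilon, T):
    return exp(epsilon / T)

print(h(0.5, 1.0), h(1.0, 2.0))   # identical h from two different (epsilon, T) pairs
```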

At the LHS of the preceding graph, we have the case where h = exp(beta*epsilon) is small (0.1 - 1). These small values mean that we are taking the exponential of a negative number; the interaction energy between two unlike units (A-B) is negative. This means that we stabilize the system by providing a different kind of structure, one which emphasizes alternating units, e.g., A-B-A-B ...

This is precisely what we observe. The pairwise combination y2 (A-B) actually increases slightly beyond its nominal expectation (its value when there is no interaction energy), going above 0.25, notably when h is in the range of 0.1 and smaller. Also, as expected, the value for z1 (A-A-A triplets) drops towards zero - triplets of like units are suppressed when the interaction energy between unlike units is negative.

Somewhat surprisingly, z3 (A-B-A triplets) also decreases as h approaches 0.1. This means that the above-nominal share of the distribution goes to z2 (A-A-B). Given that this is an even distribution of A and B units (x1 = x2 = 0.5), another way to think of the far LHS is as the case where the temperature is very large. (We then have the exponential of a negative interaction energy over a large temperature, and can think of the increased temperature as producing greater "disorder" in the system - moving us away from the highly structured A-B-A-B-A order that would otherwise exist if y2 (A-B) predominated with no other influence.)

Wednesday, December 7, 2011

"Nonadditive Entropy" - An Excellent Review Article

New Advances in Entropy Formulation - "Nonadditive Entropy"


Well, chalk it up to being newly returned to the fold: after years of work in knowledge discovery, predictive analysis, neural networks, and sensor fusion, I'm finally returning to my roots and re-invigorating some previous work that involves the Cluster Variation Method. In the course of this, I've just learned (as a Janie-come-lately) about the major evolution in thinking about entropy, largely led by Constantino Tsallis. He has an excellent review paper, The nonadditive entropy Sq and its applications in physics and elsewhere: some remarks. Beautifully done; it elegantly leads the reader through the somewhat complex and subtle arguments behind major breakthroughs in entropy formulation.
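For readers who want the formula itself (this is the standard definition from the literature, in my notation rather than quoted from the paper): for a probability distribution {p_i},

```latex
% Tsallis nonadditive entropy S_q; Boltzmann-Gibbs is recovered as q -> 1.
S_q = k \,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i
```

The "nonadditive" label comes from the composition rule for independent systems, S_q(A+B) = S_q(A) + S_q(B) + ((1-q)/k) S_q(A) S_q(B), which reduces to simple additivity only at q = 1.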

Sunday, November 27, 2011

Modeling Trends in Long-Term IT as a Phase Transition

The most reasonable model for our faster-than-exponential growth in long-term IT trends is that of a phase transition.

At a second-order phase transition, the heat capacity becomes discontinuous.




The heat capacity image is provided courtesy of a Wikipedia page on heat capacity transitions.

L. Witthauer and M. Dieterle present a number of excellent computations in graphical form in their paper The Phase Transition of the 2D-Ising Model.

There is another interesting article by B. Derrida & D. Stauffer in Europhysics Letters, Phase Transitions in Two-Dimensional Kauffman Cellular Automata.

The divergent increase in heat capacity is similar in form to the greater-than-exponential increase in IT measurables, as discussed in my previous post, Going Beyond Moore's Law, and identified in Super-exponential long-term trends in IT.

In one of my earlier posts, starting a modeling series on phase transitions from metastable states (using the Ising model with nearest-neighbor interactions and simple entropy), I identified a key challenge: pinning down what it is that we are attempting to model. That is: What is x? Once we identify what we are trying to model, we can figure out the appropriate equations.

Now, we have the same problem - but in reverse! We have an equation - actually, an entire modeling system (the Ising spin-glass model works well) - that gives us the desired heat capacity graphs. What we have to figure out now is: what exactly is being represented, if we choose the "phase transition analogy" for interpreting our faster-than-exponential growth in IT (and in other realms of human experience)?
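To make the "reverse" problem concrete, here is a minimal Metropolis Monte Carlo sketch for the 2D nearest-neighbor Ising model - the standard textbook approach, not code from the Witthauer & Dieterle paper - which reproduces the heat capacity peak near the critical temperature. Lattice size and sweep counts are kept deliberately small:

```python
# Minimal Metropolis Monte Carlo for the 2D nearest-neighbor Ising model.
# Heat capacity per spin: C = (<E^2> - <E>^2) / (N * T^2), with J = k_B = 1.
# Sizes and sweep counts are deliberately tiny; this is a sketch only.
import math
import random

random.seed(0)
L = 16                      # lattice side; N = L*L spins
N = L * L

def energy(s):
    """Total energy with periodic boundaries (each bond counted once)."""
    E = 0
    for i in range(L):
        for j in range(L):
            E -= s[i][j] * (s[(i + 1) % L][j] + s[i][(j + 1) % L])
    return E

def sweep(s, T):
    """One Metropolis sweep: N single-spin-flip attempts."""
    for _ in range(N):
        i, j = random.randrange(L), random.randrange(L)
        dE = 2 * s[i][j] * (s[(i + 1) % L][j] + s[(i - 1) % L][j]
                            + s[i][(j + 1) % L] + s[i][(j - 1) % L])
        if dE <= 0 or random.random() < math.exp(-dE / T):
            s[i][j] = -s[i][j]

for T in [1.5, 2.0, 2.269, 2.5, 3.0]:   # 2.269 ~ exact critical temperature
    s = [[random.choice([-1, 1]) for _ in range(L)] for _ in range(L)]
    for _ in range(200):                 # equilibrate
        sweep(s, T)
    E_sum = E2_sum = 0.0
    samples = 400
    for _ in range(samples):
        sweep(s, T)
        E = energy(s)
        E_sum += E
        E2_sum += E * E
    C = (E2_sum / samples - (E_sum / samples) ** 2) / (N * T * T)
    print(f"T = {T:5.3f}   C per spin = {C:.3f}")   # peak near T ~ 2.27
```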

That will be the subject of a near-term posting.

(Another good heat capacity graph is viewable at: http://physics.tamuk.edu/~suson/html/3333/Degenerate_files/image108.jpg)

Tuesday, November 22, 2011

Going Beyond Moore's Law

Super-Exponential Long-Term Trends in Information Technology


Interesting read for the day:
Super-exponential long-term trends in Information Technology, by B. Nagy, J.D. Farmer, J.E. Trancik, & J.P. Gonzales, shows that what Kurzweil suggested in his earlier work on "technology singularities" is true: we are experiencing faster-than-exponential growth within the information technology area.

Nagy et al. are careful to point out that their work indicates a "mathematical singularity," not to be confused with the more broadly-sweeping notion of a "technological singularity" discussed by Ray Kurzweil and others.
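To see what a "mathematical singularity" means here - a toy comparison of my own, not the fitting procedure of Nagy et al. - contrast exponential growth, which stays finite at every finite time, with hyperbolic growth, which diverges at a finite critical time:

```python
# Toy contrast: exponential growth stays finite at every finite t, while
# hyperbolic growth 1/(t_c - t) diverges at a finite "singularity time" t_c.
# t_c = 10.0 is an arbitrary, hypothetical choice for illustration.
from math import exp

t_c = 10.0
for t in [0.0, 5.0, 8.0, 9.0, 9.9, 9.99]:
    print(f"t = {t:5.2f}   exponential: {exp(t):12.1f}   "
          f"hyperbolic: {1.0 / (t_c - t):10.1f}")
```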

Kurzweil's now-famous book, The Singularity Is Near: When Humans Transcend Biology, was first released as a précis on his website in approximately 2000. His interesting and detailed graphs, from which he deduced that we were going "beyond exponential growth," had data points up through approximately 2000. In contrast, Nagy et al. are able to produce data points typically through 2005.



The notion of "singularity" is both interesting and important now. Sandberg (2009) has published an interesting and readable paper, An overview of models of technological singularity.