Circuits, Bond Graphs, and Signal-Flow Diagrams

19 May, 2018

 

My student Brandon Coya finished his thesis, and successfully defended it last Tuesday!

• Brandon Coya, Circuits, Bond Graphs, and Signal-Flow Diagrams: A Categorical Perspective, Ph.D. thesis, U. C. Riverside, 2018.

It’s about networks in engineering. He uses category theory to study the diagrams engineers like to draw, and functors to understand how these diagrams are interpreted.

His thesis raises some really interesting pure mathematical questions about the category of corelations and a ‘weak bimonoid’ that can be found in this category. Weak bimonoids were invented by Pastro and Street in their study of ‘quantum categories’, a generalization of quantum groups. So, it’s fascinating to see a weak bimonoid that plays an important role in electrical engineering!

However, in what follows I’ll stick to less fancy stuff: I’ll just explain the basic idea of Brandon’s thesis, say a bit about circuits and ‘bond graphs’, and outline his main results. What follows is heavily based on the introduction of his thesis, but I’ve baezified it a little.

The basic idea

People, and especially scientists and engineers, are naturally inclined to draw diagrams and pictures when they want to better understand a problem. One example is when Feynman introduced his famous diagrams in 1949; particle physicists have been using them ever since. But some other diagrams introduced by engineers are far more important to the functioning of the modern world and its technology. It’s outrageous, but sociologically understandable, that mathematicians have figured out more about Feynman diagrams than these other kinds: circuit diagrams, bond graphs and signal-flow diagrams. This is the problem Brandon aims to fix.

I’ve been unable to track down the early history of circuit diagrams, so if you know about that please tell me! But in the 1940s, Harry Olson pointed out analogies in electrical, mechanical, thermodynamic, hydraulic, and chemical systems, which allowed circuit diagrams to be applied to a wide variety of fields. On April 24, 1959, Henry Paynter woke up and invented the diagrammatic language of bond graphs to study generalized versions of voltage and current, called ‘effort’ and ‘flow,’ which are implicit in the analogies found by Olson. Bond graphs are now widely used in engineering. On the other hand, control theorists use diagrams of a different kind, called ‘signal-flow diagrams’, to study linear open dynamical systems.

Although category theory predates some of these diagrams, it was not until the 1980s that Joyal and Street showed string diagrams can be used to reason about morphisms in any symmetric monoidal category. This motivates Brandon’s first goal: viewing electrical circuits, signal-flow diagrams, and bond graphs as string diagrams for morphisms in symmetric monoidal categories.

This lets us study networks from a compositional perspective. That is, we can study a big network by describing how it is composed of smaller pieces. Treating networks as morphisms in a symmetric monoidal category lets us build larger ones from smaller ones by composing and tensoring them: this makes the compositional perspective into precise mathematics. To study a network in this way we must first define a notion of ‘input’ and ‘output’ for the network diagram. Then gluing diagrams together, so long as the outputs of one match the inputs of the other, defines the composition for a category.

Network diagrams are typically assigned data, such as the potential and current associated to a wire in an electrical circuit. Since the relation between the data tells us how a network behaves, we call this relation the ‘behavior’ of a network. The way in which we assign behavior to a network comes from first treating a network as a ‘black box’, which is a system with inputs and outputs whose internal mechanisms are unknown or ignored. A simple example is the lock on a doorknob: one can insert a key and try to turn it; it either opens the door or not, and it fulfills this function without us needing to know its inner workings. We can treat a system as a black box through the process called ‘black-boxing’, which forgets its inner workings and records only the relation it imposes between its inputs and outputs.

Since systems with inputs and outputs can be seen as morphisms in a category we expect black-boxing to be a functor out of a category of this sort. Assigning each diagram its behavior in a functorial way is formalized by functorial semantics, first introduced in Lawvere’s thesis in 1963. This consists of using categories with specific extra structure as ‘theories’ whose ‘models’ are structure-preserving functors into other such categories. We then think of the diagrams as a syntax, while the behaviors are the semantics. Thus black-boxing is actually an example of functorial semantics. This leads us to another goal: to study the functorial semantics, i.e. black-boxing functors, for electrical circuits, signal-flow diagrams, and bond graphs.

Brendan Fong and I began this type of work by showing how to describe circuits made of wires, resistors, capacitors, and inductors as morphisms in a category using ‘decorated cospans’. Jason Erbele and I, and separately Bonchi, Sobociński and Zanasi, studied signal flow diagrams as morphisms in a category. In other work Brendan Fong, Blake Pollard and I looked at Markov processes, while Blake and I studied chemical reaction networks using decorated cospans. In all of these cases, we also studied the functorial semantics of these diagram languages.

Brandon’s main tool is the framework of ‘props’, also called ‘PROPs’, introduced by Mac Lane in 1965. The acronym stands for “products and permutations”, and these operations roughly describe what a prop can do. More precisely, a prop is a strict symmetric monoidal category equipped with a distinguished object X such that every object is a tensor power X^{\otimes n}. Props arise because very often we think of a network as going between some set of input nodes and some set of output nodes, where the nodes are indistinguishable from each other. Thus we typically think of a network as simply having some natural number as an input and some natural number as an output, so that the network is actually a morphism in a prop.

Circuits and bond graphs

Now let’s take a quick tour of circuits and bond graphs. Much more detail can be found in Brandon’s thesis, but this may help you know what to picture when you hear terminology from electrical engineering.

Here is an electrical circuit made of only perfectly conductive wires:

This is just a graph, consisting of a set N of nodes, a set E of edges, and maps s,t\colon E\to N sending each edge to its source and target node. We refer to the edges as perfectly conductive wires and say that wires go between nodes. Then associated to each perfectly conductive wire in an electrical circuit is a pair of real numbers called ‘potential’, \phi, and ‘current’, I.

Typically each node gets a potential, but in the above case the potential at either end of a wire would be the same, so we may as well associate the potential to the wire. Current and potential in circuits like these obey two laws due to Kirchhoff. First, at any node, the sum of currents flowing into that node equals the sum of currents flowing out of it. The other law states that any connected wires must have the same potential.

We say that the above circuit is closed, as opposed to open, because it does not have any inputs or outputs. In order to talk about open circuits, and thereby bring the ‘compositional perspective’ into play, we need a notion of inputs and outputs for a circuit. We do this using two maps i\colon X\to N and o\colon Y \to N that specify the inputs and outputs of a circuit. Here is an example:

We call the sets X, Y, and the disjoint union X + Y the inputs, outputs, and terminals of the circuit, respectively. To each terminal we associate a potential and current. In total this gives a space of allowed potentials and currents on the terminals and we call this space the ‘behavior’ of the circuit. Since we do this association without knowing the potentials and currents inside the rest of the circuit we call this process ‘black-boxing’ the circuit. This process hides the internal workings of the circuit and just tells us the relation between inputs and outputs. In fact this association is functorial, but to understand the functoriality first requires that we say how to compose these kinds of circuits. We save this for later.

There are also electrical circuits that have ‘components’ such as resistors, inductors, voltage sources, and current sources. These are graphs as above, but with edges now labelled by elements in some set L. Here is one for example:

We call this an L-circuit. We may also black-box an L-circuit to get a space of allowed potentials and currents, i.e. the behavior of the L-circuit, and this process is functorial as well. The components in a circuit determine the possible potential and current pairs because they impose additional relationships. For example, a resistor between two nodes has a resistance R and is drawn as:

In an L-circuit this would be an edge labelled by some positive real number R. For a resistor like this Kirchhoff’s current law says I_1=I_2 and Ohm’s Law says \phi_2-\phi_1 =I_1R. This tells us how to construct the black-boxing functor that extracts the right behavior.
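As a quick sketch of how these laws cut out a behavior (the encoding below is mine, not the thesis’s), the black-boxed behavior of a resistor is a 2-dimensional linear subspace of the space of quadruples (\phi_1, I_1, \phi_2, I_2), parametrized by \phi_1 and I_1:

```python
import numpy as np

def resistor_behavior(R):
    """Columns span the behavior of a resistor: the subspace of
    (phi_1, I_1, phi_2, I_2)-space cut out by Kirchhoff's current
    law I_1 = I_2 and Ohm's law phi_2 - phi_1 = I_1 * R.
    Free parameters: phi_1 and I_1."""
    return np.array([[1.0, 0.0],   # phi_1
                     [0.0, 1.0],   # I_1
                     [1.0,   R],   # phi_2 = phi_1 + I_1 * R
                     [0.0, 1.0]])  # I_2  = I_1
```

For instance, with R = 10, the point with \phi_1 = 2 and I_1 = 0.5 has \phi_2 = 7 and I_2 = 0.5.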

Engineers often work with wires that come in pairs where the current on one wire is the negative of the current on the other wire. In such a case engineers care about the difference in potential more than each individual potential. For such pairs of perfectly conductive wires:

we call V=\phi_2-\phi_1 the ‘voltage’ and I=I_1=-I_2 the ‘current’. Note that the word ‘current’ is used for two different, yet related, concepts. We call a pair of wires like this a ‘bond’ and a pair of nodes like this a ‘port’. To summarize: bonds go between ports, and in a ‘bond graph’ we draw a bond as follows:

Note that engineers do not explicitly draw ports at the ends of bonds; we follow this notation and simply draw a bond as a thickened edge. Engineers who work with bond graphs often use the terms ‘effort’ and ‘flow’ instead of voltage and current. Thus a bond between two ports in a bond graph is drawn equipped with an effort and flow, rather than a voltage and current, as follows:

A bond graph consists of bonds connected together using ‘1-junctions’ and ‘0-junctions’. These two types of junctions impose equations between the efforts and flows on the attached bonds. The flows on bonds connected together with a 1-junction are all equal, while the efforts sum to zero, after sprinkling in some signs depending on how we orient the bonds. For 0-junctions it works the other way: the efforts are all equal while the flows sum to zero! The duality here is well-known to engineers but perhaps less so to mathematicians. This is one topic Brandon’s thesis explores.
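Here is a tiny Python check of this duality, under the simplifying assumption that all bonds are oriented inward so no extra signs appear: swapping the roles of effort and flow turns the 1-junction equations into the 0-junction equations.

```python
import numpy as np

def one_junction(efforts, flows):
    # 1-junction: all flows equal, efforts sum to zero
    # (all bonds oriented inward, so no sign bookkeeping).
    return np.allclose(flows, flows[0]) and np.isclose(np.sum(efforts), 0.0)

def zero_junction(efforts, flows):
    # 0-junction: all efforts equal, flows sum to zero.
    return np.allclose(efforts, efforts[0]) and np.isclose(np.sum(flows), 0.0)

# The duality: exchanging efforts and flows exchanges the junctions.
e, f = [3.0, -1.0, -2.0], [5.0, 5.0, 5.0]
assert one_junction(e, f)
assert zero_junction(f, e)
```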

Brandon explains bond graphs in more detail in Chapter 5 of his thesis, but here is an example:

The arrow at the end of a bond indicates which direction of current flow counts as positive, while the bar is called the ‘causal stroke’. These are unnecessary for Brandon’s work, so he adopts a simplified notation without the arrow or bar. In engineering it’s also important to attach general circuit components, but Brandon doesn’t consider these.

Outline

In Chapter 2 of his thesis, Brandon provides the necessary background for studying four categories as props:

• the category of finite sets and spans: \textrm{FinSpan}

• the category of finite sets and relations: \textrm{FinRel}

• the category of finite sets and cospans: \textrm{FinCospan}

• the category of finite sets and corelations: \textrm{FinCorel}.

In particular, \textrm{FinCospan} and \textrm{FinCorel} are crucial to the study of networks.
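Corelations compose by gluing along the middle boundary and then forgetting it. As a concrete sketch (the encoding below is my own, not the thesis’s), we can model a corelation from m to n as a partition of the disjoint union of \{0,\dots,m-1\} and \{0,\dots,n-1\}, with the target elements shifted up by m, and compose with union-find:

```python
class UnionFind:
    """Minimal union-find for merging partition blocks."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, i):
        while self.parent[i] != i:
            self.parent[i] = self.parent[self.parent[i]]
            i = self.parent[i]
        return i
    def union(self, i, j):
        self.parent[self.find(i)] = self.find(j)

def compose_corelations(R, m, n, S, p):
    """Compose corelations R: m -> n and S: n -> p.
    R is a list of blocks (sets) partitioning {0..m-1} + {0..n-1},
    with target elements shifted up by m; S likewise partitions
    {0..n-1} + {0..p-1}, with its target shifted up by n.
    Returns the composite partition of {0..m-1} + {0..p-1}."""
    uf = UnionFind(m + n + p)
    for block in R:                  # R lives on the first m + n elements
        b = sorted(block)
        for x in b[1:]:
            uf.union(b[0], x)
    for block in S:                  # shift S's elements up by m
        b = sorted(x + m for x in block)
        for x in b[1:]:
            uf.union(b[0], x)
    # Keep only the outer boundary, relabelling outputs down to
    # m .. m+p-1; blocks stuck entirely in the middle are forgotten.
    outer = list(range(m)) + list(range(m + n, m + n + p))
    groups = {}
    for x in outer:
        groups.setdefault(uf.find(x), set()).add(x if x < m else x - n)
    return list(groups.values())
```

For example, composing the identity corelation on 2, whose blocks are {0, 2} and {1, 3}, with itself gives back the same partition.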

In Corollary 2.3.4 he notes that any prop has a presentation in terms of generators and equations. Then he recalls the known presentations for \textrm{FinSpan}, \textrm{FinCospan}, and \textrm{FinRel}. Proposition 2.3.7 lets us build props as quotients of other props.

He begins Chapter 3 by showing that \textrm{FinCorel} is ‘the prop for extraspecial commutative Frobenius monoids’, based on a paper he wrote with Brendan Fong. This result also gives a presentation for \textrm{FinCorel}.

Then he defines an “L-circuit” as a graph with specified inputs and outputs, together with a labelling set for the edges of the graph. L-circuits are morphisms in the prop \textrm{Circ}_L. In Proposition 3.2.8 he uses a result of Rosebrugh, Sabadini and Walters to show that \textrm{Circ}_L can be viewed as the coproduct of \textrm{FinCospan} and the free prop on the set L of labels.

Brandon then defines \textrm{Circ} to be the prop \textrm{Circ}_L where L consists of a single element. This example is important, because \textrm{Circ} can be seen as the category whose morphisms are circuits made of only perfectly conductive wires! From any morphism in \textrm{Circ} he extracts a cospan of finite sets and then turns the cospan into a corelation. These two processes are functorial, so he gets a method for sending a circuit made of only perfectly conductive wires to a corelation:

\textrm{Circ} \stackrel{H'}{\longrightarrow} \textrm{FinCospan} \stackrel{H}{\longrightarrow} \textrm{FinCorel}

There is also a functor

K\colon \textrm{FinCorel} \to \textrm{FinRel}_k

where \textrm{FinRel}_k is the category whose objects are finite dimensional vector spaces and whose morphisms R\colon U\to V are linear relations, that is, linear subspaces R\subseteq U \oplus V. By composing with the above functors H' and H he associates a linear relation R to any circuit made of perfectly conductive wires. On the other hand he gets a subspace for any such circuit by first assigning potential and current to each terminal, and then subjecting these variables to the appropriate physical laws.
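To make this concrete, here is a small numerical sketch (my own encoding using numpy, not anything from the thesis): a linear relation R \subseteq U \oplus V is stored as a matrix whose columns span it, and composition keeps exactly the pairs whose middle V-components match.

```python
import numpy as np

def null_space(A, tol=1e-10):
    """Orthonormal basis for the kernel of A, via SVD."""
    _, s, vh = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return vh[rank:].conj().T

def compose_relations(R, S, dims):
    """Compose linear relations given as spanning matrices.
    R's columns span a subspace of U (+) V, S's columns span a
    subspace of V (+) W; dims = (dim U, dim V, dim W). Returns a
    matrix whose columns span the composite subspace of U (+) W."""
    du, dv, dw = dims
    A, B = R[:du], R[du:]      # U-part and V-part of R's spanning set
    C, D = S[:dv], S[dv:]      # V-part and W-part of S's spanning set
    # Coefficient pairs (x, y) whose middle components agree: B x = C y.
    K = null_space(np.hstack([B, -C]))
    x, y = K[:R.shape[1]], K[R.shape[1]:]
    return np.vstack([A @ x, D @ y])
```

As a sanity check, composing the graph of a matrix M with the graph of a matrix N yields the graph of NM.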

It turns out that these two ways of assigning a subspace to a morphism in \textrm{Circ} are the same. So, he calls the linear relation associated to a circuit using the composite KHH' the “behavior” of the circuit and defines the “black-boxing” functor

\blacksquare \colon \textrm{Circ}\to \textrm{FinRel}_k

to be the composite of these:

\textrm{Circ} \stackrel{H'}{\longrightarrow} \textrm{FinCospan} \stackrel{H}{\longrightarrow} \textrm{FinCorel} \stackrel{K}{\longrightarrow} \textrm{FinRel}_k

Note that the underlying corelation of a circuit made of perfectly conductive wires completely determines the behavior of the circuit via the functor K.

In Chapter 4 he reinterprets the black-boxing functor \blacksquare as a morphism of props. He does this by introducing the category \textrm{LagRel}_k, whose objects are “symplectic” vector spaces and whose morphisms are “Lagrangian” relations. In Proposition 4.1.6 he proves that the functor K\colon \textrm{FinCorel} \to \textrm{FinRel}_k actually picks out a Lagrangian relation for any corelation and thus determines a morphism of props. So, he redefines K to be this morphism

K\colon \textrm{FinCorel} \to \textrm{LagRel}_k

and reinterprets black-boxing as the composite

\textrm{Circ} \stackrel{H'}{\longrightarrow} \textrm{FinCospan} \stackrel{H}{\longrightarrow} \textrm{FinCorel} \stackrel{K}{\longrightarrow} \textrm{LagRel}_k

After doing all this hard work for circuits made of perfectly conductive wires (a warmup exercise that engineers might scoff at), Brandon shows the power of his results by easily extending the black-boxing functor to circuits with arbitrary label sets in Theorem 4.2.1. He applies this result to a prop whose morphisms are circuits made of resistors, inductors, and capacitors. Then he considers a more general and mathematically more natural approach to linear circuits using the prop \textrm{Circ}_k. The morphisms here are open circuits with wires labelled by elements of some chosen field k. In Theorem 4.2.4 he proves the existence of a morphism of props

\blacksquare \colon \textrm{Circ}_k \to \textrm{LagRel}_k

that describes the black-boxing of circuits built from arbitrary linear components.

Brandon then picks up where Jason Erbele’s thesis left off, and recalls how control theorists use “signal-flow diagrams” to draw linear relations. These diagrams make up the category \textrm{SigFlow}_k, which is the free prop generated by the same generators as \textrm{FinRel}_k. Similarly he defines the prop \widetilde{\mathrm{Circ}}_k as the free prop generated by the same generators as \textrm{Circ}_k. Then there is a strict symmetric monoidal functor T\colon \widetilde{\mathrm{Circ}}_k \to \textrm{SigFlow}_k giving a commutative square:

Of course, circuits made of perfectly conductive wires are a special case of linear circuits. We can express this fact using another commutative square:

Combining the diagrams so far, Brandon gets a commutative diagram summarizing the relationship between linear circuits, cospans, corelations, and signal-flow diagrams:

Brandon concludes Chapter 4 by extending his work to circuits with voltage and current sources. These types of circuits define affine relations instead of linear relations. The prop framework lets Brandon extend black-boxing to these types of circuits by showing that affine Lagrangian relations are morphisms in a prop \textrm{AffLagRel}_k. This leads to Theorem 4.4.5, which says that for any field k and label set L there is a unique morphism of props

\blacksquare \colon \textrm{Circ}_L \to \textrm{AffLagRel}_k

extending the other black-boxing functor and sending each element of L to an arbitrarily chosen affine Lagrangian relation between potentials and currents.

In Chapter 5, Brandon studies bond graphs as morphisms in a category. His goal is to define a category \textrm{BondGraph}, whose morphisms are bond graphs, and then assign a space of efforts and flows as behavior to any bond graph using a functor. He also constructs a functor that assigns a space of potentials and currents to any bond graph, which agrees with the way that potential and current relate to effort and flow.

The subtle way he defines \textrm{BondGraph} comes from two different approaches to studying bond graphs, and the problems inherent in each approach. The first approach leads him to a subcategory \textrm{FinCorel}^\circ of \textrm{FinCorel}, while the second leads him to a subcategory \textrm{LagRel}_k^\circ of \textrm{LagRel}_k. There isn’t a commutative square relating these four categories, but Brandon obtains a pentagon that commutes up to a natural transformation by inventing a new category \textrm{BondGraph}:

This category is a way of formalizing Paynter’s idea of bond graphs.

In his first approach, Brandon views a bond graph as an electrical circuit. He takes advantage of his earlier work on circuits and corelations by taking \textrm{FinCorel} to be the category whose morphisms are circuits made of perfectly conductive wires. In this approach a terminal is the object 1 and a wire is the identity corelation from 1 to 1, while a circuit from m terminals to n terminals is a corelation from m to n.

In this approach Brandon thinks of a port as the object 2, since a port is a pair of nodes. Then he thinks of a bond as a pair of wires and hence the identity corelation from 2 to 2. Lastly, the two junctions are two different ways of connecting ports together, and thus specific corelations from 2m to 2n. It turns out that by following these ideas he can equip the object 2 with two different Frobenius monoid structures, which behave very much like 1-junctions and 0-junctions in bond graphs!

It would be great if the morphisms built from these two Frobenius monoids corresponded perfectly to bond graphs. Unfortunately there are some equations which hold between morphisms made from these Frobenius monoids that do not hold for corresponding bond graphs. So, Brandon defines a category \textrm{FinCorel}^\circ using the morphisms that come from these two Frobenius monoids and moves on to a second attempt at defining \textrm{BondGraph}.

Since bond graphs impose Lagrangian relations between effort and flow, this second approach starts by looking back at \textrm{LagRel}_k. The relations associated to a 1-junction make k\oplus k into yet another Frobenius monoid, while the relations associated to a 0-junction make k\oplus k into a different Frobenius monoid. These two Frobenius monoid structures interact to form a bimonoid! Unfortunately, a bimonoid has some equations between morphisms that do not correspond to equations between bond graphs, so this approach also does not result in morphisms that are bond graphs. Nonetheless, Brandon defines a category \textrm{LagRel}_k^\circ using the two Frobenius monoid structures k\oplus k.

Since it turns out that \textrm{FinCorel}^\circ and \textrm{LagRel}_k^\circ have corresponding generators, Brandon defines \textrm{BondGraph} as a prop that also has corresponding generators, but with only the equations found in both \textrm{FinCorel}^\circ and \textrm{LagRel}_k^\circ. By defining \textrm{BondGraph} in this way he automatically gets two functors

F\colon \textrm{BondGraph} \to \textrm{LagRel}_k^\circ

and

G\colon \textrm{BondGraph} \to \textrm{FinCorel}^\circ

The functor F associates effort and flow to a bond graph, while the functor G lets us associate potential and current to a bond graph using the previous work done on \textrm{FinCorel}. Then the Lagrangian subspace relating effort, flow, potential, and current:

\{(V,I,\phi_1,I_1,\phi_2,I_2) | V = \phi_2-\phi_1, I = I_1 = -I_2\}

defines a natural transformation in the following diagram:

Putting this together with the diagram we saw before, Brandon gets a giant diagram which encompasses the relationships between circuits, signal-flow diagrams, bond graphs, and their behaviors in category theoretic terms:

This diagram is a nice quick road map of his thesis. Of course, you need to understand all the categories in this diagram, all the functors, and also their applications to engineering, to fully appreciate what he has accomplished! But his thesis explains that.

To learn more

Coya’s thesis has lots of references, but if you want to see diagrams at work in actual engineering, here are some good textbooks on bond graphs:

• D. C. Karnopp, D. L. Margolis and R. C. Rosenberg, System Dynamics: A Unified Approach, Wiley, New York, 1990.

• F. T. Brown, Engineering System Dynamics: A Unified Graph-Centered Approach, Taylor and Francis, New York, 2007.

and here’s a good one on signal-flow diagrams:

• B. Friedland, Control System Design: An Introduction to State-Space Methods, S. W. Director (ed.), McGraw–Hill Higher Education, 1985.


Saving Climate Data (Part 5)

6 February, 2017


There’s a lot going on! Here’s a news roundup. I will separately talk about what the Azimuth Climate Data Backup Project is doing.

I’ll start with the bad news, and then go on to some good news.

Tweaking the EPA website

Scientists are keeping track of how the Trump administration is changing the Environmental Protection Agency website, with before-and-after photos and analysis:

• Brian Kahn, Behold the “tweaks” Trump has made to the EPA website (so far), Natural Resources Defense Council blog, 3 February 2017.

There’s more about “adaptation” to climate change, and less about how it’s caused by carbon emissions.

All of this would be nothing compared to the new bill to eliminate the EPA, or Myron Ebell’s plan to fire most of the people working there:

• Joe Davidson, Trump transition leader’s goal is two-thirds cut in EPA employees, Washington Post, 30 January 2017.

If you want to keep track of this battle, I recommend getting a 30-day free subscription to this online magazine:

InsideEPA.com.

Taking animal welfare data offline

The Trump team is taking animal-welfare data offline. The US Department of Agriculture will no longer make lab inspection results and violations publicly available, citing privacy concerns:

• Sara Reardon, US government takes animal-welfare data offline, Nature Breaking News, 3 February 2017.

Restricting access to geospatial data

A new bill would prevent the US government from providing access to geospatial data if it helps people understand housing discrimination. It goes like this:

Notwithstanding any other provision of law, no Federal funds may be used to design, build, maintain, utilize, or provide access to a Federal database of geospatial information on community racial disparities or disparities in access to affordable housing.

For more on this bill, and the important ways in which such data has been used, see:

• Abraham Gutman, Scott Burris, and the Temple University Center for Public Health Law Research, Where will data take the Trump administration on housing?, Philly.com, 1 February 2017.

The EDGI fights back

The Environmental Data and Governance Initiative or EDGI is working to archive public environmental data. They’re helping coordinate data rescue events. You can attend one and have fun eating pizza with cool people while saving data:

• 3 February 2017, Portland
• 4 February 2017, New York City
• 10-11 February 2017, Austin, Texas
• 11 February 2017, U. C. Berkeley, California
• 18 February 2017, MIT, Cambridge, Massachusetts
• 18 February 2017, Haverford, Connecticut
• 18-19 February 2017, Washington DC
• 26 February 2017, Twin Cities, Minnesota

Or, work with EDGI to organize your own data rescue event! They provide some online tools to help download data.

I know there will also be another event at UCLA, so the above list is not complete, and it will probably change and grow over time. Keep up-to-date at their site:

Environmental Data and Governance Initiative.

Scientists fight back

The pushback is so big it’s hard to list it all! For now I’ll just quote some of this article:

• Tabitha Powledge, The gag reflex: Trump info shutdowns at US science agencies, especially EPA, 27 January 2017.

THE PUSHBACK FROM SCIENCE HAS BEGUN

Predictably, counter-tweets claiming to come from rebellious employees at the EPA, the Forest Service, the USDA, and NASA sprang up immediately. At The Verge, Rich McCormick says there’s reason to believe these claims may be genuine, although none has yet been verified. A lovely head on this post: “On the internet, nobody knows if you’re a National Park.”

At Hit&Run, Ronald Bailey provides handles for several of these alt tweet streams, which he calls “the revolt of the permanent government.” (That’s a compliment.)

Bailey argues, “with exception perhaps of some minor amount of national security intelligence, there is no good reason that any information, data, studies, and reports that federal agencies produce should be kept from the public and press. In any case, I will be following the Alt_Bureaucracy feeds for a while.”

At NeuroDojo, Zen Faulkes posted on how to demand that scientific societies show some backbone. “Ask yourself: ‘Have my professional societies done anything more political than say, please don’t cut funding?’ Will they fight?” he asked.

Scientists associated with the group 500 Women Scientists donned lab coats and marched in DC as part of the Women’s March on Washington the day after Trump’s Inauguration, Robinson Meyer reported at the Atlantic. A wildlife ecologist from North Carolina told Meyer, “I just can’t believe we’re having to yell, ‘Science is real.’”

Taking a cue from how the Women’s March did its social media organizing, other scientists who want to set up a Washington march of their own have put together a closed Facebook group that claims more than 600,000 members, Kate Sheridan writes at STAT.

The #ScienceMarch Twitter feed says a date for the march will be posted in a few days. [The march will be on 22 April 2017.] The group also plans to release tools to help people interested in local marches coordinate their efforts and avoid duplication.

At The Atlantic, Ed Yong describes the political action committee 314Action. (314=the first three digits of pi.)

Among other political activities, it is holding a webinar on Pi Day—March 14—to explain to scientists how to run for office. Yong calls 314Action the science version of Emily’s List, which helps pro-choice candidates run for office. 314Action says it is ready to connect potential candidate scientists with mentors—and donors.

Other groups may be willing to step in when government agencies wimp out. A few days before the Inauguration, the Centers for Disease Control and Prevention abruptly and with no explanation cancelled a 3-day meeting on the health effects of climate change scheduled for February. Scientists told Ars Technica’s Beth Mole that CDC has a history of running away from politicized issues.

One of the conference organizers from the American Public Health Association was quoted as saying nobody told the organizers to cancel.

I believe it. Just one more example of the chilling effect on global warming. In politics, once the Dear Leader’s wishes are known, some hirelings will rush to gratify them without being asked.

The APHA guy said they simply wanted to head off a potential last-minute cancellation. Yeah, I guess an anticipatory pre-cancellation would do that.

But then—Al Gore to the rescue! He is joining with a number of health groups—including the American Public Health Association—to hold a one-day meeting on the topic Feb 16 at the Carter Center in Atlanta, CDC’s home base. Vox’s Julia Belluz reports that it is not clear whether CDC officials will be part of the Gore rescue event.

The Sierra Club fights back

The Sierra Club, of which I’m a proud member, is using the Freedom of Information Act or FOIA to battle or at least slow the deletion of government databases. They wisely started even before Trump took power:

• Jennifer A Dlouhy, Fearing Trump data purge, environmentalists push to get records, BloombergMarkets, 13 January 2017.

Here’s how the strategy works:

U.S. government scientists frantically copying climate data they fear will disappear under the Trump administration may get extra time to safeguard the information, courtesy of a novel legal bid by the Sierra Club.

The environmental group is turning to open records requests to protect the resources and keep them from being deleted or made inaccessible, beginning with information housed at the Environmental Protection Agency and the Department of Energy. On Thursday [January 9th], the organization filed Freedom of Information Act requests asking those agencies to turn over a slew of records, including data on greenhouse gas emissions, traditional air pollution and power plants.

The rationale is simple: Federal laws and regulations generally block government agencies from destroying files that are being considered for release. Even if the Sierra Club’s FOIA requests are later rejected, the record-seeking alone could prevent files from being zapped quickly. And if the records are released, they could be stored independently on non-government computer servers, accessible even if other versions go offline.


The Stochastic Resonance Program (Part 2)

28 August, 2014

guest post by David Tanzer

Last time we introduced the concept of stochastic resonance. Briefly, it’s a way that noise can amplify a signal, by giving an extra nudge that helps a system receiving that signal make the jump from one state to another. Today we’ll describe a program that demonstrates this concept. But first, check it out:

Stochastic resonance.

No installation required! It runs as a web page that lets you set the parameters of the model and observe the resulting output signal. It responds quickly, because it runs right in your browser, as JavaScript.

There are sliders for controlling the amounts of sine wave and noise involved in the mix. As explained in the previous article, when we set the wave to a level not quite sufficient to cause the system to oscillate between states, and we add in the right amount of noise, stochastic resonance should kick in:


The program implements a mathematical model that runs in discrete time. It has two stable states, and is driven by a combination of a sine forcing function and a noise source.
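To make the model concrete, here is a minimal sketch of such a discrete-time simulation: a bistable system driven by a sine wave plus noise. This is an illustration under assumptions, not the program's actual code; the function names (`bistable`, `sineCurve`, `simulate`, `gaussian`) and the drift polynomial x − x³ are mine.

```javascript
// Hypothetical sketch of a discrete-time bistable model driven by
// a sine forcing function plus noise (not the program's actual code).
function bistable(x) { return x - x * x * x; }   // drift with stable states at x = ±1
function sineCurve(amplitude, frequency, t) {
  return amplitude * Math.sin(2 * Math.PI * frequency * t);
}

// Box-Muller transform for normally distributed samples
function gaussian() {
  const u = 1 - Math.random(), v = Math.random();
  return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

function simulate(amplitude, frequency, dither, tStep, nSteps) {
  let t = 0, x = 1;                              // start in one of the stable states
  const path = [[t, x]];
  for (let i = 0; i < nSteps; i++) {
    const noise = dither * gaussian();           // scaled noise sample
    x = x + tStep * (sineCurve(amplitude, frequency, t) + bistable(x)) + noise;
    t = t + tStep;
    path.push([t, x]);
  }
  return path;
}
```

With the forcing amplitude just below the threshold for switching, adding a moderate `dither` lets the noise push the state over the barrier in sync with the sine wave.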

The code builds on top of a library called JSXGraph, which supports function plotting, interactive graphics, and data visualization.

Running the program

If you haven’t already, go try the program. On one plot it shows a sine wave, called the forcing signal, and a noisy time series, called the output signal.

There are four sliders, which we’ll call Amplitude, Frequency, Noise and Sample-Path.

• The Amplitude and Frequency sliders control the sine wave. Try them out.

• The output signal depends, in a complex way, on the sine wave. Vary Amplitude and Frequency to see how they affect the output signal.

• The amount of randomization involved in the process is controlled by the Noise slider. Verify this.

• Change the Sample-Path slider to alter the sequence of random numbers that are fed to the process. This will cause a different instance of the process to be displayed.

Now try to get stochastic resonance to kick in…

Going to the source

Time to look at the blueprints. It’s easy.

• Open the model web page. The code is now running in your browser.

• While there, run your browser’s view-source function. For Firefox on the Mac, press Cmd-U. For Firefox on the PC, press Ctrl-U.

• You should see the HTML file for the web page itself.

• See the “script” directives at the head of this file. Each one refers to a JavaScript program on the internet. When the browser sees such a directive, it fetches the program and loads it into its internal JavaScript interpreter. Here are the directives:

http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=default

http://cdnjs.cloudflare.com/ajax/libs/jsxgraph/0.93/jsxgraphcore.js

http://./StochasticResonanceEuler.js

http://./normals.js

The first one loads MathJax, which is a formula-rendering engine. Next comes JSXGraph, a library that provides support for plotting and interactive graphics. Next, StochasticResonanceEuler.js is the main code for the model, and finally, normals.js provides random numbers.

• In the source window, click on the link for StochasticResonanceEuler.js — and you’ve reached the source!

Anatomy of the program

The program implements a stochastic difference equation, which defines the changes in the output signal as a function of its current value and a random noise value.

It consists of the following components:

  1. Interactive controls to set parameters

  2. Plot of the forcing signal

  3. Plot of the output signal

  4. A function that defines a particular SDE

  5. A simulation loop, which renders the output signal.

The program contains seven functions. The top-level function is initCharts. It dispatches to initControls, which builds the sliders, and initSrBoard, which builds the curve objects for the forcing function and the output signal (called “position curve” in the program). Each curve object is assigned a function that computes the (t,x) values for the time series, which gets called whenever the input parameters change. The function that is assigned to the forcing curve computes the sine wave, and reads the amplitude and frequency values from the sliders.

The calculation method for the output signal is set to the function mkSrPlot, which performs the simulation. It begins by defining a function for the deterministic part of the derivative:

deriv = Deriv(t,x) = SineCurve(t) + BiStable(x).

Then it constructs a “stepper” function, through the call Euler(deriv, tStep). A stepper function maps the current point (t,x) and a noise sample to the next point (t’,x’). The Euler stepper maps

((t,x), noiseSample)

to

(t + tStep, x + tStep * Deriv(t,x) + noiseSample).
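The stepper construction described above can be sketched as follows. The shape of `Euler` is taken from the text; the exact signature in StochasticResonanceEuler.js may differ.

```javascript
// Sketch of the Euler stepper: Euler(deriv, tStep) returns a function
// mapping ((t, x), noiseSample) to the next point
// (t + tStep, x + tStep * deriv(t, x) + noiseSample).
function Euler(deriv, tStep) {
  return function (point, noiseSample) {
    const [t, x] = point;
    return [t + tStep, x + tStep * deriv(t, x) + noiseSample];
  };
}
```

Returning a closure like this lets the simulation loop apply the stepper repeatedly without re-passing the step size or the derivative function.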

The simulation loop is then performed by the function sdeLoop, which is given:

• The stepper function

• The noise amplitude (“dither”)

• The initial point (t0,x0)

• A randomization offset

• The number of points to generate

The current point is initialized to (t0,x0), and then the stepper is repeatedly applied to the current point and the current noise sample. The output returned is the sequence of (t,x) values.
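The loop just described can be sketched like this. The parameter list follows the bullet points above, with a `noiseArray` argument added here to make the sketch self-contained; the real sdeLoop presumably reads the precomputed array another way.

```javascript
// Sketch of the simulation loop: repeatedly apply the stepper to the
// current point and the current scaled noise sample, collecting (t, x) values.
function sdeLoop(stepper, dither, t0, x0, offset, numPoints, noiseArray) {
  let point = [t0, x0];
  const path = [point];
  for (let i = 0; i < numPoints; i++) {
    const noiseSample = dither * noiseArray[offset + i];  // scale by noise amplitude
    point = stepper(point, noiseSample);
    path.push(point);
  }
  return path;
}
```

The `offset` parameter is what the Sample-Path slider controls: a different starting point in the noise array yields a different instance of the process.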

The noise samples are normally distributed random numbers stored in an array. They get scaled by the noise amplitude when they are used. The array contains more values than are needed. By changing the starting point in the array, different instances of the process are obtained.

Making your own version of the program

Now let’s tweak the program to do new things.

First, let’s make a local copy of the program on your machine and get it to run there. Make a directory, say /Users/macbookpro/stochres. Open the html file in the view source window. Paste it into the file /Users/macbookpro/stochres/stochres.html. Next, in the view source window, click on the link to StochasticResonanceEuler.js. Paste the text into /Users/macbookpro/stochres/StochasticResonanceEuler.js.

Now point your browser to the file, with the URL file:///Users/macbookpro/stochres/stochres.html. To prove that you’re really executing the local copy, make a minor edit to the html text, and check that it shows up when you reload the page. Then make a minor edit to StochasticResonanceEuler.js, say by changing the label text on the slider from “forcing function” to “forcing signal.”

Programming exercises

Now let’s get warmed up with some bite-sized programming exercises.

  1. Change the color of the sine wave.

  2. Change the exponent in the bistable polynomial to values other than 2, to see how this affects the output.

  3. Add an integer-valued slider to control this exponent.

  4. Modify the program to perform two runs of the process, and show the output signals in different colors.

  5. Modify it to perform ten runs, and change the output signal to display the point-wise average of these ten runs.

  6. Add an input slider to control the number of runs.

  7. Add another plot, which shows the standard deviation of the output signals at each point in time.

  8. Replace the precomputed array of normally distributed random numbers with a run-time computation that uses a random number generator. Use the Sample-Path slider to seed the random number generator.

  9. When the sliders are moved, explain the flow of events that causes the recalculation to take place.
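For the exercise on run-time random numbers: Math.random() cannot be seeded, so a seedable pseudorandom generator is needed to reproduce a sample path from the slider value. Here is one possible approach using the well-known mulberry32 generator plus the Box-Muller transform; none of this is from the program itself.

```javascript
// Seedable PRNG (mulberry32): same seed, same sequence of uniforms in [0, 1).
function mulberry32(seed) {
  return function () {
    let t = (seed += 0x6D2B79F5);
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Turn the uniform generator into a normal-sample generator (Box-Muller).
function makeGaussian(seed) {
  const rand = mulberry32(seed);
  return function () {
    const u = 1 - rand(), v = rand();   // u in (0, 1] avoids log(0)
    return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
  };
}
```

Wiring the Sample-Path slider value in as `seed` would then reproduce the current behavior: each slider position shows a distinct but repeatable instance of the process.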

A small research project

What is the impact of the frequency of the forcing signal on its transmission through stochastic resonance?

• Make a hypothesis about the relationship.

• Check your hypothesis by varying the Frequency slider.

• Write a function to measure the strength of the output signal at the forcing frequency. Let sinwave be a discretely sampled sine wave at the forcing frequency, and coswave a discretely sampled cosine wave. Let sindot be the dot product of sinwave and the output signal, and similarly for cosdot. Then the power measure is sindot² + cosdot².

• Modify the program to perform N trials at each frequency over some specified range of frequencies, and measure the average power over all N trials. Plot the power as a function of frequency.

• The above plot required you to fix a wave amplitude and noise level. Choose five different noise levels, and plot the five curves in one figure. Choose your noise levels in order to explore the range of qualitative behaviors.

• Produce several versions of this five-curve plot, one for each sine amplitude. Again, choose your amplitudes in order to explore the range of qualitative behaviors.
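The power measure described in the research project can be sketched directly. The function below assumes the output signal is an array of x values sampled at interval tStep; the name `powerAtFrequency` is mine.

```javascript
// Power measure: project the output signal onto sine and cosine waves at the
// forcing frequency, then sum the squared dot products (sindot² + cosdot²).
function powerAtFrequency(outputSignal, frequency, tStep) {
  let sindot = 0, cosdot = 0;
  for (let i = 0; i < outputSignal.length; i++) {
    const t = i * tStep;
    sindot += Math.sin(2 * Math.PI * frequency * t) * outputSignal[i];
    cosdot += Math.cos(2 * Math.PI * frequency * t) * outputSignal[i];
  }
  return sindot * sindot + cosdot * cosdot;
}
```

Using both the sine and cosine projections makes the measure insensitive to the phase lag between the forcing signal and the output, which is what you want when scanning across frequencies.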