As you may know, there’s a wonderful and famous analogy between classical mechanics and electrical circuit theory. I explained it back in “week288”, so I won’t repeat that story now. If you don’t know what I’m talking about, take a look!

This analogy opens up the possibility of quantizing electrical circuits by straightforwardly copying the way we quantize classical mechanics problems. I’d often wondered if this would be useful.

Michel Devoret, Rob Schoelkopf and others call this idea quantronics: the study of mesoscopic electronic effects in which collective degrees of freedom like currents and voltages behave quantum mechanically.

I just learned about this from a talk by Sean Barrett here in Coogee. There are lots of cool applications, but right now I’m mainly interested in how this extends the set of analogies between different physical theories.

One interesting thing is how they quantize circuits with resistors. Over in classical mechanics, this corresponds to systems with friction. These systems, called ‘dissipative’ systems, don’t have a conserved energy. More precisely, energy leaks out of the system under consideration and gets transferred to the environment in the form of heat. It’s hard to quantize systems where energy isn’t conserved, so people in quantronics model resistors as infinite chains of inductors and capacitors: see the ‘LC ladder circuit’ on page 15 of Devoret’s notes. This idea is also the basis of the Caldeira–Leggett model of a particle coupled to a heat bath made of harmonic oscillators: it amounts to including the environment as part of the system being studied.
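Since the analogy with classical mechanics is doing all the work here, it may help to record its simplest instance in formulas. An LC circuit is literally a harmonic oscillator once charge plays the role of position and flux the role of momentum, so canonical quantization goes through verbatim (a standard sketch, not spelled out in the notes above):

```latex
% LC circuit as a harmonic oscillator: charge Q ~ position, flux \Phi ~ momentum
H = \frac{\Phi^2}{2L} + \frac{Q^2}{2C},
\qquad
[\hat{Q}, \hat{\Phi}] = i\hbar
\quad\Longrightarrow\quad
E_n = \hbar\omega\left(n + \tfrac{1}{2}\right),
\qquad
\omega = \frac{1}{\sqrt{LC}}.
```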

Hello, John. You should also take a look at this paper and references therein:

• G. Z. Cohen, Y. V. Pershin and M. Di Ventra, Lagrange formalism of memory circuit elements: classical and quantum formulations.

Dissipative systems require a somewhat different Lagrangian approach. Moreover, Jeltsema’s papers deserve your attention in case you don’t know them, for example:

• Dimitri Jeltsema, Multidomain modeling of nonlinear networks and systems: energy- and power-based perspectives.

A related idea is to consider the whole physical realm as a system and the physical laws as “the circuit software”. Great post once again, John! :)

About the quantronics stuff: I didn’t know about it either! Very interesting. In the memory device circuitry proposed by Chua, a working memristor can reach some kind of quantization, depending on the physical requirements of the memristance function (see the Wikipedia article on memristors). Don’t you feel there is some big picture relating all this stuff: mechanics, thermodynamics, electronics, Lagrangians, Maxwell equations/relations… gravity too? What about it? :D

I enjoyed this very readable review by Christopher Jarzynski, “Equalities and Inequalities: Irreversibility and the Second Law of Thermodynamics at the Nanoscale”. It covers a lot of the fun stuff seen around these parts lately, including modeling dissipative systems.

Thanks, I’ll check it out!

Thanks, that looks like a fascinating paper! Let me quote the abstract so everyone can see what I mean:

Abstract. The general Lagrange-Euler formalism for the three memory circuit elements, namely, memristive, memcapacitive, and meminductive systems, is introduced. In addition, mutual meminductance, i.e. mutual inductance with a state depending on the past evolution of the system, is defined. The Lagrange-Euler formalism for a general circuit network, the related work-energy theorem, and the generalized Joule’s first law are also obtained. Examples of this formalism applied to specific circuits are provided, and the corresponding Hamiltonian and its quantization for the case of non-dissipative elements are discussed. The notion of memory quanta, the quantum excitations of the memory degrees of freedom, is presented. Specific examples are used to show that the coupling between these quanta and the well-known charge quanta can lead to a splitting of degenerate levels and to other experimentally observable quantum effects.

I still don’t understand memristors very well, but the idea of a ‘memory quantum’ sounds cool! I wish I could buy a few!

Dissipative systems require a somewhat different Lagrangian approach.

I really really wish I understood a good variational approach to dissipative systems—something they maximize or minimize. Do you understand this? I’m hoping that by a ‘Lagrangian approach’ you mean deriving Euler-Lagrange-like equations from some variational principle.

Don’t you feel there is some big picture relating all this stuff: mechanics, thermodynamics, electronics, Lagrangians, Maxwell equations/relations,…

Yes! I’m trying to figure it out and explain it as I go along.

Gravity too? What about it? :D

That’s one of the few things I’m not very interested in these days. I spent a long time thinking about general relativity. Right now I’m having more fun thinking about ‘practical’ or ‘applied’ subjects—from a mathematical perspective.

I don’t know if you’ve read my posts starting with “week288” and going up to “week297”. These laid out a detailed correspondence between different subjects:

                          displacement   flow               momentum               effort
                          q              dq/dt              p                      dp/dt
Mechanics (translation)   position       velocity           momentum               force
Mechanics (rotation)      angle          angular velocity   angular momentum       torque
Electronics               charge         current            flux linkage           voltage
Hydraulics                volume         flow               pressure momentum      pressure
Thermodynamics            entropy        entropy flow       temperature momentum   temperature
Chemistry                 moles          molar flow         chemical momentum      chemical potential
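To see how literal the first rows of this table are, here is a small numerical sketch (my own illustration, with made-up parameter values): a mass on a spring and an LC circuit obey the same Hamilton’s equations once we translate m ↔ L and k ↔ 1/C, so one integrator serves both readings.

```python
import math

def simulate(q, p, mass, stiffness, dt, steps):
    """Leapfrog integration of Hamilton's equations:
    dq/dt = p/mass, dp/dt = -stiffness*q."""
    for _ in range(steps):
        p -= 0.5 * dt * stiffness * q
        q += dt * p / mass
        p -= 0.5 * dt * stiffness * q
    return q, p

# Mechanics reading: mass m = 2, spring constant k = 8.
# Electronics reading of the SAME numbers: inductance L = 2, capacitance
# C = 1/8, with q = charge and p = flux linkage instead of position and momentum.
m, k = 2.0, 8.0
q1, p1 = simulate(q=1.0, p=0.0, mass=m, stiffness=k, dt=1e-3, steps=20000)

def energy(q, p):
    return p * p / (2 * m) + k * q * q / 2

# The Hamiltonian (energy) is conserved to high accuracy along the flow...
print(abs(energy(q1, p1) - energy(1.0, 0.0)) / energy(1.0, 0.0) < 1e-4)

# ...and both readings share the frequency omega = sqrt(k/m) = 1/sqrt(L*C).
print(abs(math.sqrt(k / m) - 1.0 / math.sqrt(m * (1.0 / k))) < 1e-12)
```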

Lately I’ve been working out how ideas like Hamilton’s equations and the symplectic structure generalize from mechanics to thermodynamics using this correspondence… and also how quantum mechanics is analogous to thermodynamics with ‘quantropy’ replacing entropy. So I think it’s all fitting together better and better, and I just hope a few people have been listening and agree that it’s all very nice and simple. If not, I may be forced to write a book! Just to punish the world for not paying attention to me.

By the way, optics fits in here too. Sean Barrett pointed out this comprehensive, free quantum optics textbook:

• Dan Steck, Quantum and Atom Optics.

More stuff to read!

I have to admit that I’ve not been paying attention to the quantropy thread, but this table is very helpful and very interesting! I’d heard some of these analogies before (e.g., that voltage is analogous to pressure), but I’ve never seen the analogies extended to include thermodynamics (ah, so that’s what temperature is!), and more importantly, I hadn’t been minding my p’s and q’s! I may go back now and see if I can read some of this quantropy stuff.

Sorry for a silly comment. Just a note of appreciation. :-)

Hi Todd – great to hear from you! If you like those analogies (and how can anyone resist: it’s like having a secret pocket map of many branches of science!), you can read more about them at week290. Indeed one reads about them here and there, especially the hydraulic analogy, which Heaviside derisively called the ‘drain-pipe theory’ of electrical circuits—and perhaps even more these days, the analogy between electrical circuits and systems of masses and springs. But the people who really develop them seriously are the engineers who work on systems built from a mix of electrical, mechanical, hydraulic, chemical and thermodynamic components. They need a unified language!

These engineers tend to emphasize that $\dot{p}\,\dot{q}$ has dimensions of power, and they consider this equation to be very important:

$$\frac{dH}{dt} = \dot{p}\,\dot{q}$$

It’s called the power balance equation and it’s a kind of equation of ‘nonconservation of energy’: it tells you the rate at which energy is leaving our system. Here $H$ is, as usual, the energy or ‘Hamiltonian’ as a function of the momentum and position variables. So if you’ve seen Hamilton’s equations, you’ll instantly notice that the power balance equation looks like some strange parody of those. Hamilton’s equations, after all, imply that energy is conserved:

$$\frac{dH}{dt} = 0$$
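Here is a toy numerical check of that statement (my own sketch, with made-up parameters): add a friction force to a harmonic oscillator and the drop in the Hamiltonian matches the dissipated power integrated over time.

```python
# Damped harmonic oscillator: force = -k*q - gamma*(dq/dt).
# Power balance: the energy H = p^2/(2m) + k*q^2/2 decreases at the rate
# gamma*(dq/dt)^2, the power dissipated by friction.  (Toy illustration.)
m, k, gamma, dt, steps = 1.0, 1.0, 0.1, 1e-5, 200_000

def hamiltonian(q, p):
    return p * p / (2 * m) + k * q * q / 2

q, p = 1.0, 0.0
dissipated = 0.0                      # integral of gamma * velocity^2 dt
h0 = hamiltonian(q, p)
for _ in range(steps):                # semi-implicit Euler, t from 0 to 2
    v = p / m
    dissipated += gamma * v * v * dt
    p += (-k * q - gamma * v) * dt
    q += (p / m) * dt

# Energy lost by the system equals energy dissipated as heat:
print(abs((h0 - hamiltonian(q, p)) - dissipated) < 1e-3)
```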

So what the heck is going on???

Briefly, I think the engineers have generalized Hamiltonian mechanics to ‘open systems’: systems that interact with the outside world, like a toaster that you can plug into a power socket, or a stereo amplifier with places in the back where you can insert jacks for signals coming and going from your speakers, CD player and so on. For open systems energy is not conserved, because it can enter and leave. You can draw these systems using pictures like electrical circuits, which category theorists like you call ‘string diagrams’.

So, I realized that open systems of the sort described by that table are actually morphisms in some symmetric monoidal category, and I decided to study that category. I decided to start by considering electrical circuits made only of resistors, and I showed that this particular category is very nicely described in terms of ‘Dirichlet forms’ and ‘Lagrangian subspaces of a symplectic vector space’. I said how in “week297”: that was the climax of a certain story I’d been telling in This Week’s Finds.

I’ve mostly finished a paper on that… but then, instead of marching on straightforwardly and adding other circuit elements (capacitors, inductors, etc.) to the mix, I got distracted and started studying stochastic Petri nets and Markov processes. But then it turned out they’re deeply related! I explained that in part 16 of my network theory posts. Briefly, if you’ve got an electrical circuit made of resistors, the electrical potential $\phi$ will in equilibrium obey an equation

$$H \phi = 0$$

similar to Laplace’s equation

$$\nabla^2 \phi = 0$$

but you can use the same operator $H$ (called a ‘Dirichlet operator’, just another way of thinking about a Dirichlet form) to set up an equation

$$\frac{d\phi}{dt} = H \phi$$

similar to the heat equation

$$\frac{\partial \phi}{\partial t} = \nabla^2 \phi$$

and this latter equation describes a Markov process, just as the heat equation describes Brownian motion. This is in fact well-known (at least by those who know it well).
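A minimal numerical illustration of this correspondence (my own sketch, for a 3-node chain of unit resistors): take H to be minus the graph Laplacian of the network, check that constant potentials solve the equilibrium equation, and watch the heat-equation-like evolution conserve total ‘heat’ and relax to equilibrium, exactly as a Markov process should.

```python
# Three nodes joined by unit resistors in a chain.  The Dirichlet operator H
# is minus the graph Laplacian of the network.  (Toy illustration.)
H = [[-1.0,  1.0,  0.0],
     [ 1.0, -2.0,  1.0],
     [ 0.0,  1.0, -1.0]]

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

# Equilibrium (the analogue of Laplace's equation): H phi = 0 for constant phi.
print(all(abs(x) < 1e-12 for x in apply(H, [1.0, 1.0, 1.0])))

# Heat-equation analogue: d(phi)/dt = H phi, integrated by Euler steps.
phi = [1.0, 0.0, 0.0]             # all the 'heat' starts at node 0
dt = 0.01
for _ in range(5000):             # evolve to t = 50
    step = apply(H, phi)
    phi = [phi[i] + dt * step[i] for i in range(3)]

# Total 'heat' is conserved (columns of H sum to zero), and the distribution
# relaxes to the uniform equilibrium -- the behaviour of a Markov process.
print(abs(sum(phi) - 1.0) < 1e-9)
print(all(abs(x - 1.0 / 3.0) < 1e-6 for x in phi))
```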

Anyway, the newer stuff I’ve been doing on symplectic aspects of thermodynamics, and on quantropy, is also part of this story. But just sketching it here has made it clear how sprawling and disorganized it is. I need to explain it in a more well-organized way! I plan to write a bunch of papers on this once I get back from Singapore and get a new batch of grad students—students help me stay focused on one thing at a time, and provide a good excuse for explaining stuff.

Maybe next year, in 2013, or at the end of this year, you could really buy “a quantum memory”. Memristors are coming to the real world soon. At least, if the schedule holds, Hynix and HP will be releasing memristive memories/memristors to the market. I read that they have already built transistor-memristor hybrids. If they succeed, and I think they will, a new revolution in memories is coming to replace the old-fashioned transistor.

A priori, a memristor remembers its last state, so we will not have to reboot our computers; if the memristor implementation is correct, turning off a complete OS will be as fast as turning on and off a light bulb! (Chua dixit). See also this great paper: http://www.ece.tamu.edu/~huang/files/materials606/circuit.pdf

Chua promised a second part but, in a forum interaction, he said that due to the little feedback and the increasing duties he had acquired because of the HP discovery, he has not been able to write the sequel to that pretty work above.

Yes, I have read the posts you mention, and all the stuff you refer to. I have followed you since your early blogging times (I mean your TWF), and we have met before in cyberspace. You have been one of the sources of inspiration for my upcoming blog and microblog on physics, mathematics and science.

About the dissipative Lagrangian-like dynamics: I agree with you, and you are right about what I meant. I tend to think the dissipative Lagrangian formalism could be related to an enlarged multi-Lagrangian-like framework, likely using Nambu mechanics! Nambu mechanics has recently been successfully applied to atmospheric dynamics, and, in case you don’t know, it has been useful in a reformulation of the Maxwell equations as a Nambu field system, here: arxiv.org/pdf/1011.5282

About the book: it could be a great project. Don’t neglect the idea even if your visitors don’t increase exponentially! :P

“…if the memristor implementation is correct, turning off a complete OS will be as fast as turning on and off a light bulb!”

(I assume you mean turning on a complete OS, since turning off can typically be done, destructively, in about one second by pulling the plug.) The main problem with starting up computers is not due to the use of volatile memory technology. The problem is that economics drives OS manufacturers to create systems with bugs (especially in the presence of hardware failures, even those that are common or minor in nature). The existence of bugs makes for a low Mean Time Before Failure, which makes it necessary to “bootstrap” an OS from its ‘first principles’, initializing and rebuilding its complicated data structures over a time that can be measured in seconds or minutes. Fast startup (such as from Hibernate or Sleep power modes) will always require an occasional much slower Full Bootstrap startup procedure until software engineers are encouraged to use bug-lowering technologies (like “smart resource pointers” or large-scale program proving). Sorry if this is off-topic.

Thanks, David—that’s not off-topic as far as I’m concerned, it’s very interesting. I’ve always wondered why it took so darn long for a computer to do a full startup, and I’ve certainly noticed how when I keep ‘hibernating’ my computer it keeps working more and more poorly, sort of like my brain does when I keep failing to get a full night’s sleep.

Yes, I meant turning on! A typo, I am sorry. Could you edit it, John? David, you are right. But I think that as technology evolves, or whenever it makes its next leap, starting up will get faster. There are already some computers implementing fast boot systems, so I don’t think very differently from you. Bug fixes and error-correcting codes will become more efficient, unless of course you build or write something like Windows Vista! lol. About why bug-lowering technologies are not popular… I had not realized how important that is! And since J.C.B. is now worried about how to save the world, we should have asked that question too! Maybe a new case of technological obsolescence? Is there some interest, then, in consumers having those inefficient and buggy systems? What about their energy/power efficiency? I presume buggy systems consume more power than low-bug systems, or dissipate more heat into the atmosphere. Is that correct? I am not sure. Like John, I think it is not off-topic. It is a very interesting topic, parallel to all this stuff related to flux, entropy (information), electronics and hydraulics. Especially since we are bound to invest in technologies that save energy, and to phase out those which are driving global warming.

It’s too late to edit it: a whole interesting conversation has grown up around that comment. Don’t worry, we all know what you meant.

I see there’s still interest in Jaynes’ maximum caliber principle.

In this context the following will be interesting: almost all physical quantities can be ranked into 7 quantity tables, in which every cell is identified by a Planck unit. There are only 4 pure electric quantities: electric charge, magnetic flux, and the ratios of those two: electric admittance (= electric charge/magnetic flux) and electric impedance (= magnetic flux/electric charge). The other quantities, like electric vector potential, magnetic flux density, electric capacitance, magnetic permeability, magnetic moment etc., are all mixed quantities: combinations of electric charge or magnetic flux together with length or time.

Rectification: in the tables, the cells in which electric impedance and electric admittance reside must be coloured red, and not blue-red as they are now. http://fqxi.org/data/essay-contest-files/van_Gaalen_digital_analog_e_1.pdf

I found this delicious talk:

• Arnau Dòria-Cerezo and Dimitri Jeltsema, Port-Hamiltonian modeling of the memristor and the higher order elements.

Your readers may be as interested in it as I am! It is so cool. Unfortunately, I cannot find a Ph.D. position on that topic. I would be interested. My search continues… :(

That’s a great talk! Everyone interested in analogies between different physical theories should look at it. Among other things, it describes a mechanical analogue of a memristor on page 11, and a mechanical analogue of a meminductor on page 20. It also discusses these gadgets using the theory of ‘port-Hamiltonian systems’, meaning Hamiltonian mechanics for open systems.

It should be possible for qualified people to do PhD’s on this subject in a department of engineering or applied mathematics. For example, Jeltsema is in a department of applied mathematics, and Jan Willems, another good researcher in this area, is in a department of electrical engineering.

I’m going to try to make these into subjects that ordinary ‘unapplied’ math grad students can work on, too.

By ‘this subject’ I meant open Hamiltonian systems, as studied by engineers under the name of ‘port-Hamiltonian systems’. It’s possible that Dirac manifolds and even Courant algebroids are relevant, but I haven’t seen people doing the things I have in mind using those. And I don’t want to publicly say what I have in mind until I write a paper about it, or at least some blog articles!

By the way, we should plan your visit here. I’ve been meaning to email you about this! My summer calendar is gradually getting filled up: for example, I’ll be in Barcelona at a conference on the mathematics of biodiversity from June 18 to July 6, and today I discovered I may be spending a couple of weeks in Europe before that as well; it’s my big annual journey to the west. Before this interval, and after it until mid-September, I should be in Singapore most of the time… though I’ll probably nip down to Java for a bit. John Huerta and Jamie Vicary will be visiting me at points, but that shouldn’t be a problem.

Dirac structures are the standard framework for port-Hamiltonian (open, not necessarily holonomic, Hamiltonian) systems. This seems to be well understood by the practitioners. Courant algebroids also show up, but mostly as trivial bundles over manifolds with Dirac structures. I have a student who tried to sort through this last summer.

We eventually discovered that it’s hard to think of port-Hamiltonian systems without having a background in standard (closed, holonomic) Hamiltonian systems. So we are trying to remedy this now.

I’m going to try to make these into subjects that ordinary ‘unapplied’ math grad students can work on, too.

This subject mostly exists under such names as “Courant algebroids” and “Dirac manifolds.” I wonder if I am misunderstanding / misinterpreting what you just said and you mean something else?

I am a theoretical physicist, with an M.Sc. in theoretical physics too, from Spain. I am sending my CV to different places around the world. My problems are perhaps my grades and my age (33, almost 34); this is why I am searching for “something” around the world.

Yesterday, I applied to an offer in plasma physics and another in extrasolar planet detection. LOL. But I love mathematics (indeed I love algebraic stuff too, like division algebras, n-ary algebras, Nambu mechanics and Clifford algebras) and, mainly, physics.

Do you know some places where I could see specific offers? I usually check brightrecruit, FindAPhD, TipTop, etc.

Do you know people working on Nambu mechanics? I recommend you look at the latest news related to the role of the enstrophy functional. I seem to remember that Terence Tao has posted a tentative attack on the Navier-Stokes Clay problem in which he uses “enstrophy” (not to be confused with entropy). 92 pages and very technical, too long for me, but your math-inclined readers could be interested: http://arxiv.org/abs/1108.1165

I don’t have much help to offer, but you can get an idea who is working on Nambu mechanics (or any other topic in math or physics) by doing an arXiv search.

Thank you, John. It is a pity that my country is not investing in science or R&D; a national tragedy! I will see if I can find something out there, somehow. My job as an acting high-school teacher is so boring, in spite of the passion I put into it. Hahaha. Anyway, things are not very different out there, except for the fact that truly developed countries invest where the future IS.

About dissipative systems in Lagrangians (variational principles including dissipation): you should read about the work of Riewe on fractional calculus, plus the stuff related to the Rayleigh dissipation function. It is a pity those works are not generally in the curriculum of many physicists! However, I agree with you that the role of dissipation in classical dynamics is not yet well enough understood. There are also some pretty cool recent developments in dissipative Nambu mechanics, but as far as I know, dissipation (energy loss due to friction) is really THE QUESTION to be understood in order to get a better understanding of dark matter and dark energy… provided the vacuum is something like a medium where the Higgs field lives and where friction of the vacuum can be important too!


Hello, John. You should also take a look at this paper and references therein:

• G. Z. Cohen, Y. V. Pershin and M. Di Ventra, Lagrange formalism of memory circuit elements: classical and quantum formulations.

Dissipative systems require a somehow different Lagrangian approach. Moreover, Jeltsema‘s papers, like this one, deserve your attention in case you don’t know them:

• Dimitri Jeltsema, Multidomain Modeling of nonlinear networks and systems: energy and power based Perspectives.

Related stuff has to do with considering the whole physical realms as a system and the physical laws as “the circuit software”. Great post once again John! :)

About the Quantronics stuff: I didn’t not know it either! Very interesting. According to the memory device circuitry proposed by Chua, the working memristor can reach some kind of quantization, according to the physical requirements of memristance function (see the Wikipedia article on memristors). Aren’t you feeling that it seems there exists some big picture relating all the stuff: Mechanics, Thermodynamics, Electronics, Lagrangians, Maxwell equations/relations,…Gravity too? What about it? :D

I enjoyed this very readable review by Christopher Jarzynski, “Equalities and Inequalities: Irreversibility and the Second Law of Thermodynamics at the Nanoscale”. It covers a lot of the fun stuff seen around these parts lately, including modeling dissipative systems.

Thanks, I’ll check it out!

Amarashiki wrote:

Thanks, that looks like a fascinating paper! Let me quote the abstract so everyone can see what I mean:

I still don’t understand memristors very well, but the idea of a ‘memory quantum’ sounds cool! I wish I could buy a few!

I

really really wishI understood a good variational approach to dissipative systems—something they maximize or minimize. Do you understand this? I’m hoping that by a ‘Lagrangian approach’ you mean deriving Euler-Lagrange-like equations from some variational principle.Yes! I’m trying to figure it out and explain it as I go along.

That’s one of the few things I’m not very interested in these days. I spent a long time thinking about general relativity. Right now I’m having more fun thinking about ‘practical’ or ‘applied’ subjects—from a mathematical perspective.

I don’t know if you’ve read my posts starting with “week288” and going until “week297”. This laid out a detailed correspondence between different subjects:

Lately I’ve been working out how ideas like Hamilton’s equations and the symplectic structure generalize from mechanics to thermodynamics using this correspondence… and also how quantum mechanics is analogous to thermodynamics with ‘quantropy’ replacing entropy. So I think it’s all fitting together better and better, and I just hope a few people have been listening and agree that it’s all very nice and simple. If not, I may be forced to write a book! Just to punish the world for not paying attention to me.

By the way, optics fits in here too. Sean Barrett pointed out this comprehensive, free quantum optics textbook:

• Dan Steck,

Quantum and Atom Optics.More stuff to read!

I have to admit that I’ve not been paying attention to the quantropy thread, but this table is very helpful and very interesting! I’d heard some of these analogies before (e.g., that voltage is analogous to pressure), but I’ve never seen the analogies extended to include thermodynamics (ah, so that’s what temperature is!), and more importantly, I hadn’t been minding my p’s and q’s! I may go back now and see if I can read some of this quantropy stuff.

Sorry for a silly comment. Just a note of appreciation. :-)

Hi Todd – great to hear from you! If you like those analogies (and how can anyone resist: it’s like having a secret pocket map of many branches of science!), you can read more about them at week290. Indeed one reads about them here and there, especially the hydraulic analogy, which Heaviside derisively called the ‘drain-pipe theory’ of electrical circuits—and perhaps even more these days, the analogy between electrical circuits and systems of masses and springs. But the people who really develop them seriously are the engineers who work on systems built from a mix of electrical, mechanical, hydraulic, chemical and thermodynamic

components. They need a unified language!

These engineers tend to emphasize that has dimensions of

power, and they consider this equation to be very important:It’s called the

power balance equationand it’s a kind of equation of ‘nonconservation of energy’: it tells you the rate at which energy is leaving our system. is, as usual, the energy or ‘Hamiltonian’ as a function of the momentum and position variables. So if you’ve seen Hamilton’s equations, you’ll instantly notice that the power balance equation looks like some strange parody of those. Hamilton’s equations, after all, imply that energy is conserved:So what the heck is going on???

Briefly, I think the engineers have generalized Hamiltonian mechanics to ‘open systems’: systems that interact with the outside world, like a toaster with that you can plug into a power socket, or a stereo amplifier with places in the back where you can insert jacks for signals coming and going from your speaker, CD player and so on. For open systems energy is not conserved because it can enter and leave. You can draw these systems using pictures like electrical circuits, which category theorists like you call ‘string diagrams’.

So, I realized that open systems of the sort described by that table are actually morphisms in some symmetric monoidal category, and I decided to study that category. I decided to start by considering electrical circuits made only of resistors, and I showed that this particular category is very nicely described in terms of ‘Dirichlet forms’ and ‘Lagrangian subspaces of symplectic vector space’. I said how in “week297”: that was the climax of a certain story I’d been telling in This Week’s Finds.

I’ve mostly finished a paper on that… but then, instead of marching on straightforwardly and adding other circuit elements (capacitors, inductors, etc.) to the mix, I got distracted and started studying stochastic Petri nets and Markov processes. But then it turned out they’re deeply related! I explained that in part 16 of my network theory posts. Briefly, if you’ve got an electrical circuit made of resistors, the electrical potential will in equilibrium obey an equation

similar to Laplace’s equation

but you can use the same operator (called a ‘Dirichlet operator’, just another way of thinking about a Dirichlet form) to set up an equation

similar to the heat equation

and this latter equation describes a Markov process, just as the heat equation describes Brownian motion. This is in fact well-known (at least by those who know it well).

Anyway, the newer stuff I’ve been doing on symplectic aspects of thermodynamics, and quantropy, are also parts of this story. But just sketching it here has made it clear how sprawling and disorganized it is. I need to explain it in a more well-organized way! I plan to write a bunch of papers on this once I get back from Singapore and get a new batch of grad students—students help me stay focused on one thing at a time, and provide a good excuse for explaining stuff.

Maybe the next year 2013 or at the end to this year 2012 you could really buy “a quantum memory”. Memristors are coming to the real world soon. At least, if the schedule is accomplished, Hynix and HP will be releasing memristive memories/memristors to the market. I read that they had already built transistor-memristor hybrid stuff. If they succeed, and I think so, a new revolution in memories are coming to replace the old-fashioned transistor.

A priori, a memristor remembers the last state, so, it implies we will have not to reboot our computers, if the memristor implementation is correct, turning off a complete OS will be as fast as turning on and off a light bulb! ( Chua dixit). See also this great paper: http://www.ece.tamu.edu/~huang/files/materials606/circuit.pdf

Chua promised a 2nd part, but, in a forum interaction, he told that due to the little feedback and the increasing duties he had acquired because of the HP discovery, he has not been able to write the sequel to that pretty work above.

Yes, I have read the posts you comment, and all the stuff you mention. I follow you since your early blogger times (I mean your TWF), and we have met before in the cyberspace. You have been one of the sources of inspiration for my upcoming blog and microblog on Physics, Mathematics and Science.

About the dissipative Lagrangian-like dynamics. I agree with you and you are right about I was meaning. I tend to think the dissipative lagrangian formalism could be related to an enlarged multilagrangian-like framework, likely using Nambu Mechanics! Nambu Mechanics has been recently succesfully applied to atmospheric dynamics, and, in case you don’t know, it has been useful in a reformulation of Maxwell equations as a Nambu-field system, here: arxiv.org/pdf/1011.5282

About the book, it could be a great project. Don’t neglect the idea if your visitors don’t increase exponentially! :P

“…if the memristor implementation is correct, turning off a complete OS will be as fast as turning on and off a light bulb!”

(I assume you mean turning on a complete OS, since turning off can typically be done, destructively, in about one second by pulling the plug.) The main problem with starting up computers is not due to the use of volatile memory technology. The problem is that economics drives OS manufacturers to create systems with bugs (especially in the presence of hardware failures, even those that are common or minor in nature). The existence of bugs makes for a low Mean Time Before Failure, which makes it necessary to “bootstrap” an OS from its ‘first principles’, initializing and rebuilding its complicated data structures over a time that can be measured in seconds or minutes. Fast startup (such as from Hibernate or Sleep power modes) will always require an occasional much slower Full Bootstrap startup procedure until software engineers are encouraged to use bug-lowering technologies (like “smart resource pointers” or large-scale program proving). Sorry if this is off-topic.

David Spector

Springtime Software

Thanks, David—that’s not off-topic as far as I’m concerned, it’s very interesting. I’ve always wondered why it took so darn long for a computer to do a full startup, and I’ve certainly noticed how when I keep ‘hibernating’ my computer it keeps working more and more poorly, sort of like my brain does when I keep failing to get a full night’s sleep.

Yes, I was meaning turning on! A typos, I am sorry, could you edit it John? David, you are right. But I think that as far as Technology evolves or whenever it has the next breakdown/leap, the starting up will increase its speed. There are already some computers implementing fast boot systems, so, I think not very differently to you. Bugs or correcting error codes will be more efficient and better, unless of course you build or write something like Windows Vista! lol About why bug-lowering technologies are not popular…I had not realized it what important it is! And since J.C.B. is now worried about how to save the world, we should have asked that question too! Maybe a new case of Technological Obsolescence? Is there some interest in that consumers have those inefficient a buggy systems then? What about their energy/power efficiency? I presume buggy systems consume more power than a low-buggy systems or they dissipate more heat into atmosphere, is it correct?I am not sure. I think as John, it is not off-topic. It is a very interesting and parallel topic ot all this stuff related to flux, entropy(information),electronics or hydraulics. Specially, since we are doomed to invest in technologies saving energy and to decrease those which are driving the global warming.

Amarashiki wrote:

It’s too late to edit it: a whole interesting conversation has grown up around that comment. Don’t worry, we all know what you meant.

I see there’s still interest in Jaynes’ maximum caliber principle.

In this context the following will be interesting: almost all physical quantities can be arranged into 7 quantity tables, in which every cell is identified by a Planck unit. There are only 4 pure electric quantities: electric charge, magnetic flux, and the two ratios of these: electric admittance (= electric charge/magnetic flux) and electric impedance (= magnetic flux/electric charge). The other quantities, like the electric vector potential, magnetic flux density, electric capacitance, magnetic permeability, magnetic moment, etc., are all mixed quantities: combinations of electric charge or magnetic flux together with length or time.
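As a quick sanity check (my addition, not part of the comment above), the two ratio claims can be verified by dimensional analysis over the SI base units. This sketch represents each unit as a vector of exponents over (kg, m, s, A); the variable names are just illustrative:

```python
# Dimensions as exponent vectors over the SI base units (kg, m, s, A).

def dim(kg=0, m=0, s=0, A=0):
    return (kg, m, s, A)

def div(a, b):
    """Dimensions of the quotient a/b: subtract exponent vectors."""
    return tuple(x - y for x, y in zip(a, b))

weber   = dim(kg=1, m=2, s=-2, A=-1)   # magnetic flux: V*s = kg*m^2/(s^2*A)
coulomb = dim(s=1, A=1)                # electric charge: A*s
ohm     = dim(kg=1, m=2, s=-3, A=-2)   # impedance: V/A

# Magnetic flux / electric charge has the dimensions of impedance,
# and the inverse ratio those of admittance (siemens = 1/ohm).
assert div(weber, coulomb) == ohm
assert div(coulomb, weber) == tuple(-x for x in ohm)
```

So a weber per coulomb is indeed an ohm, which is why only these two ratios of the "pure" electric quantities appear in the tables.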

Correction: in the tables, the cells in which electric impedance and electric admittance reside must be coloured red, and not blue-red as they are now. http://fqxi.org/data/essay-contest-files/van_Gaalen_digital_analog_e_1.pdf

I found this delicious talk:

• Arnau Dòria-Cerezo and Dimitri Jeltsema, Port-Hamiltonian modeling of the memristor and the higher order elements.

Your readers may be as interested in it as I am! It is so cool. Unfortunately, I cannot find a Ph.D. position on that topic. I would be interested. My search continues… :(

That’s a great talk! Everyone interested in analogies between different physical theories should look at it. Among other things, it describes a mechanical analogue of a memristor on page 11, and a mechanical analogue of a meminductor on page 20. It also discusses these gadgets using the theory of ‘port-Hamiltonian systems’, meaning Hamiltonian mechanics for open systems.
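For readers who haven't met port-Hamiltonian systems before, the standard input-state-output form used in that literature looks roughly like this (a sketch; notation varies between authors):

```latex
\dot{x} \;=\; \bigl(J(x) - R(x)\bigr)\,\nabla H(x) \;+\; g(x)\,u,
\qquad
y \;=\; g(x)^{\mathsf T}\,\nabla H(x),
```

where $H$ is the Hamiltonian (stored energy), $J(x)$ is skew-symmetric (the power-conserving internal interconnection), $R(x)$ is positive semidefinite (dissipation), and $u$, $y$ are the port variables. The resulting power balance $\dot H = -\nabla H^{\mathsf T} R\,\nabla H + y^{\mathsf T} u \le y^{\mathsf T} u$ is what makes "energy flowing in and out through the ports" precise for open systems.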

It should be possible for qualified people to do PhD’s on this subject in a department of engineering or applied mathematics. For example, Jeltsema is in a department of applied mathematics, and Jan Willems, another good researcher in this area, is in a department of electrical engineering.

I’m going to try to make these into subjects that ordinary ‘unapplied’ math grad students can work on, too.

By ‘this subject’ I meant open Hamiltonian systems, as studied by engineers under the name of ‘port-Hamiltonian systems’. It’s possible that Dirac manifolds and even Courant algebroids are relevant, but I haven’t seen people doing the things I have in mind using those. And I don’t want to publicly say what I have in mind until I write a paper about it, or at least some blog articles!

By the way, we should plan out your visit here. I’ve been meaning to email you about this! My summer calendar is gradually getting filled up: for example, I’ll be in Barcelona at a conference on the mathematics of biodiversity from June 18 to July 6, and today I discovered I may be spending a couple of weeks in Europe before that as well—it’s my big annual journey to the west. Before this interval, and after this interval until mid-September, I should be in Singapore most of the time… though I’ll probably nip down to Java for a bit. John Huerta and Jamie Vicary will be visiting me at points, but that shouldn’t be a problem.

Dirac structures are the standard framework for port-Hamiltonian systems (open, not necessarily holonomic, Hamiltonian systems). This seems to be well understood by the practitioners. Courant algebroids also show up, but mostly as trivial bundles over manifolds with Dirac structures. I have a student who tried to sort through this last summer.

We eventually discovered that it’s hard to think of port-Hamiltonian systems without having a background in standard (closed, holonomic) Hamiltonian systems. So we are trying to remedy this now.

I wrote to you about my visit by email.

John wrote:

This subject mostly exists under such names as “Courant algebroids” and “Dirac manifolds.” I wonder if I am misunderstanding / misinterpreting what you just said and you mean something else?

I am a theoretical physicist, with a Master of Science in theoretical physics too, from Spain. I am sending my CV to different places around the world. My problems are perhaps my grades and my age (33, almost 34); this is why I am searching for “something” around the world.

Yesterday, I applied to an offer in plasma physics and another in “extrasolar planet” detection. LOL. But mainly I love mathematics (indeed, I also love algebraic stuff like division algebras, n-ary algebras, Nambu mechanics and Clifford algebras) and physics.

Do you know some places where I could see specific offers? I usually check brightrecruit, FindAPhD, tiptop, etc.

Do you know people working on Nambu mechanics? I recommend you look at the latest news related to the rôle of the enstrophy functional. I seem to remember that Terence Tao has submitted a tentative attack on the Navier–Stokes Clay problem in which he uses “enstrophy” (not to be confused with entropy). It is 92 pages and very technical, too long for me on that topic, but your math-inclined readers could be interested. http://arxiv.org/abs/1108.1165
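To spell out the term (my gloss, not part of the comment): for an incompressible velocity field $u$ with vorticity $\omega = \nabla \times u$, the enstrophy is the squared $L^2$ norm of the vorticity,

```latex
\mathcal{E}(t) \;=\; \int_{\mathbb{R}^3} |\omega(x,t)|^2 \, dx,
\qquad
\omega \;=\; \nabla \times u .
```

Controlling the growth of this functional is one standard route toward regularity results for Navier–Stokes; despite the similar name, it has nothing to do with thermodynamic entropy.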

Cheers.

I don’t have much help to offer, but you can get an idea who is working on Nambu mechanics (or any other topic in math or physics) by doing an arXiv search.

Thank you John. It is a pity that my country is not investing in science or R&D; a national tragedy! I will see if I can find something out there, somehow. My job as an acting high-school teacher is so boring, in spite of the passion I put into it. Hahaha. Anyway, things are not very different out there, except for the fact that truly developed countries invest where the future is.

About dissipative systems in Lagrangians (variational principles including dissipation): you should read about the work of Riewe on fractional calculus, plus the stuff related to the Rayleigh dissipation function. It is a pity those works are not generally in the curriculum of many physicists! However, I agree with you that the role of dissipation in classical dynamics is not yet well enough understood. There are some pretty cool recent developments in dissipative Nambu mechanics as well, but as far as I know, dissipation (energy loss due to friction) is really THE QUESTION to be understood in order to get a better understanding of dark matter and dark energy… provided the vacuum is something like a medium where the Higgs field lives, and where friction on the vacuum can be important too!
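For reference (my addition), the Rayleigh dissipation function mentioned here enters the Euler–Lagrange equations in the standard textbook way:

```latex
\frac{d}{dt}\frac{\partial L}{\partial \dot q_i}
\;-\; \frac{\partial L}{\partial q_i}
\;=\; -\,\frac{\partial \mathcal{F}}{\partial \dot q_i},
\qquad
\mathcal{F} \;=\; \tfrac{1}{2} \sum_{i,j} c_{ij}\,\dot q_i \dot q_j .
```

For example, the damped oscillator with $L = \tfrac12 m\dot q^2 - \tfrac12 k q^2$ and $\mathcal{F} = \tfrac12 c\,\dot q^2$ yields $m\ddot q + c\dot q + kq = 0$: friction is bolted onto the variational principle rather than derived from it, which is exactly the shortcoming that the fractional-calculus and LC-ladder approaches try to address.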