[patterns-discussion] Physics Patterns


  • From: "Mike Beedle" <beedlem AT e-architects.com>
  • To: <patterns-discussion AT cs.uiuc.edu>
  • Subject: [patterns-discussion] Physics Patterns
  • Date: Fri, 12 Mar 2004 02:24:38 -0600


[Dear friends, I am looking for feedback on a new tool and
a set of concepts useful to physicists that at their root
embed "pattern" concepts: "Physics Patterns".]


I. Introduction -- Physics Patterns

I want to provide a tool or set of tools (tentatively
called "Physics Patterns") that:

facilitates the cycle of the theoretical and computational
elementary-particle physicist (a concrete QED sketch follows the list):

1) proposing a Lagrangian with "valid" symmetries,
2) deciding on the fields and their interactions,
3) writing the action,
4) evaluating the path integrals with Feynman-style
propagators/diagrams, and
5) evaluating cross sections (to be compared with experiments).
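
As a concrete instance of steps 1)-5), and purely as standard textbook
material rather than anything the tool produces yet, take QED. Its
Lagrangian and local U(1) symmetry are:

  \mathcal{L}_{QED} = \bar{\psi}\,(i\gamma^{\mu}D_{\mu} - m)\,\psi
                      - \frac{1}{4}F_{\mu\nu}F^{\mu\nu},
  D_{\mu} = \partial_{\mu} + i e A_{\mu},
  F_{\mu\nu} = \partial_{\mu}A_{\nu} - \partial_{\nu}A_{\mu}

Step 3) gives the action S = \int d^{4}x \, \mathcal{L}_{QED}, step 4)
expands the path integral
\int \mathcal{D}A \, \mathcal{D}\psi \, \mathcal{D}\bar{\psi}\, e^{iS}
in powers of the charge e as Feynman diagrams, and step 5) turns the
resulting amplitudes into cross sections that accelerators can check.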

(The tool, now in an embryonic state, is based on:
Lisa: http://lisa.sourceforge.net
JLisa: http://jlisa.sourceforge.net
Maxima: http://maxima.sourceforge.net
MatLisp: http://matlisp.sourceforge.net
and runs on CMU Lisp. It is an interpreter that parses
a Lagrangian expressed in the Maxima language, together with
some initial-condition data, and then spits formulas
and numbers back.)

Apocryphal stories about Richard Feynman say that
he was able to compute in hours path integrals that would
take other physicists days or even months. Well, this
tool, in a sense, attempts to capture the "mind of Feynman":
abstract its computational patterns for Lagrangians
and the evaluation of path integrals, and put them to
work in a general context (QED, QCD, Electroweak, the Standard
Model, Gravity, String theory, Loop Quantum Gravity, etc.).



II. Detailed Explanation

More specifically, the tool aims at facilitating the above cycle
using "rule-oriented" algebraic and computational software,
such that a "user", a theoretical and computational physicist,
can:

1) propose a Lagrangian with "valid" symmetries,
2) decide on the fields to be used (scalar, vector, tensor,
spinor, Grassmann, etc.), and
3) decide on the nature of their interactions or couplings
based on assumptions about the exchange particles: mass, spin,
charge, color, charm, fermion number, boson number, etc.,
including renormalization group flow options.

This validation is facilitated through a rule-oriented analysis
of the mathematical symmetry patterns of a "proposed
Lagrangian", using Lisa and Maxima in a Lisp environment
(a toy sketch of such a rule follows the list below).

The exit condition is that the "proposed Lagrangian"
be "interesting but valid" in terms of:
1) its symmetries,
2) the viability of the proposed fields, and
3) its exchange particles.
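
To make the "rule-oriented" idea concrete, here is a toy sketch in
plain Common Lisp (not the actual Lisa rule base; all field symbols
are invented for illustration). It encodes a single rule: a Lagrangian
term is invariant under a global U(1) rotation exactly when its net
U(1) charge is zero.

;;; A term is a list of field factors, e.g. (phi phi-bar).
;;; Under phi -> e^{i a} phi and phi-bar -> e^{-i a} phi-bar,
;;; a term is invariant iff its factors' charges sum to zero.

(defun u1-charge (factor)
  "U(1) charge carried by one field factor."
  (case factor
    ((phi d-mu-phi) +1)            ; phi and its derivative: charge +1
    ((phi-bar d-mu-phi-bar) -1)    ; conjugates: charge -1
    (t 0)))                        ; neutral factors (m^2, couplings, ...)

(defun u1-invariant-p (term)
  "True when TERM has zero net U(1) charge."
  (zerop (reduce #'+ (mapcar #'u1-charge term))))

(defun validate-lagrangian (terms)
  "Return the terms that break global U(1) invariance (NIL = valid)."
  (remove-if #'u1-invariant-p terms))

;; (validate-lagrangian
;;   '((d-mu-phi-bar d-mu-phi) (m^2 phi-bar phi) (phi phi)))
;; => ((PHI PHI))     ; the charge-2 term is flagged as a violation

The real tool would express rules like this in Lisa and operate on
Maxima expressions rather than bare symbol lists.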

(From a technical perspective, I might have to incorporate
or translate to Lisp/Lisa some of the guts of a tool
like GAP, "Groups, Algorithms and Programming":
http://www-gap.dcs.st-and.ac.uk/~gap/.)
However, in some previous papers I have shown some
equivalences among
1) "rule-oriented" systems,
2) group-theoretical approaches,
3) pattern-oriented approaches (Grenander, Alexander, etc.), and
4) genetic systems:
http://www.mikebeedle.com/pub/unified.pdf
http://www.livingmetaphor.org/pattern-languages-autocatalytic-system.html
so although *much more* can be said and done, at least
I am not coming empty-handed into this task :-)

GAP, as well as portions of Maxima and MatLisp, is
capable of evaluating matrix and tensor expressions
that are relevant to Physics, so that actions of
different groups (U(1), SO(2), SU(2), SO(3),
SU(3), SO(N), SU(N)) on selected sets (complex scalars,
vectors, spinors, tensors, Grassmann variables, etc.) can
be evaluated at both the algebraic and computational
levels.
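
As a minimal, self-contained illustration (plain Common Lisp lists
instead of MatLisp or GAP objects), here is the kind of algebraic fact
such evaluations rest on: the generators T_a = sigma_a/2 built from the
Pauli matrices close the su(2) algebra, [T_1, T_2] = i T_3.

(defun m* (a b)
  "Product of two 2x2 matrices given as nested lists."
  (loop for row in a collect
        (loop for j below 2 collect
              (loop for k below 2
                    sum (* (nth k row) (nth j (nth k b)))))))

(defun m-scale (c a)
  "Multiply every entry of matrix A by the scalar C."
  (mapcar (lambda (row) (mapcar (lambda (x) (* c x)) row)) a))

(defun m-diff (a b)
  "Entry-wise difference of two matrices."
  (mapcar (lambda (ra rb) (mapcar #'- ra rb)) a b))

(defun commutator (a b)
  "[A, B] = AB - BA."
  (m-diff (m* a b) (m* b a)))

(defparameter *sigma-1* '((0 1) (1 0)))
(defparameter *sigma-2* (list (list 0 (complex 0 -1))
                              (list (complex 0 1) 0)))
(defparameter *sigma-3* '((1 0) (0 -1)))

;; [sigma_1/2, sigma_2/2] equals i * sigma_3/2:
;; (equal (commutator (m-scale 1/2 *sigma-1*) (m-scale 1/2 *sigma-2*))
;;        (m-scale (complex 0 1/2) *sigma-3*))
;; => T

In the actual tool these checks would go through Maxima, MatLisp, or
GAP, which already handle arbitrary SU(N)/SO(N) representations; the
point is only that "an action of a group on a set of fields" reduces
to matrix identities a rule engine can verify.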

Once the Lagrangian is validated, the tool
also facilitates:

3) writing the action,
4) evaluating the path integrals with Feynman-style
propagators/diagrams, and
5) evaluating cross sections.

This can be done through "rule-oriented" calculations of
path integrals that help the "user" evaluate path
integrals related to general multi-particle scatterings
(the standard expansion being automated is written out
below), both in:

a) the algebraic sense (Maxima), with potential theoretical value,

and in

b) the computational sense, on the evaluation of
probabilities and cross sections side (MatLisp/LAPACK),
with the potential of comparing the calculations to
experiments in accelerators (or other naturally occurring
phenomena).
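
The expansion those rules have to generate is the standard
perturbative one (textbook material, nothing specific to the tool):

  S = T exp(-i \int d^{4}x \, \mathcal{H}_{I}(x))
    = \sum_{n=0}^{\infty} \frac{(-i)^{n}}{n!}
      \int d^{4}x_{1} \cdots d^{4}x_{n} \,
      T\{ \mathcal{H}_{I}(x_{1}) \cdots \mathcal{H}_{I}(x_{n}) \}

Each order-n term, Wick-contracted into propagators, is a finite set
of Feynman diagrams with n vertices; squaring the resulting amplitudes
and integrating over phase space gives the cross sections of b).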

The "back-end" architecture consists of the encoding
of these rules, or at least most of the rules, for
the creation and evaluation of "general" Feynman
diagrams related to the scattering of "general"
particles with options for masses, spins, charges,
colors, charm, fermion number, boson number, etc.;
with the hope that much sophisticated multi-particle
Lagrangians can be evaluated ......

without even ever writing a formula for
the expansion of the path integral into
Feynman diagrams with propagators. (The only thing
required would be a configuration option to know
at what power to stop :-)
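
A sketch of what that single configuration option might look like in
the Lisp layer (the name is invented for illustration only):

;; Hypothetical knob, for illustration: the highest power of the
;; coupling kept when the path integral is expanded into diagrams.
(defparameter *max-perturbative-order* 2
  "Truncation order for the diagram expansion; 2 keeps the
two-vertex (tree-level) QED diagrams, higher values add loops.")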

Every theoretician's nightmare
is to find and compute such an expansion. In fact,
it is my belief that many Ph.D.s in Physics are
granted for what this tool will eventually be
able to do in possibly ... hours, if not minutes.

Most of the rules above, both in the Lagrangian validation
and in the evaluation of the path integrals for the
different interactions (Electroweak; Strong, primitive and
residual; Gravity), are in fact _known_, so the initial
work would be more of an "encoding and testing job"
than research work per se. (Thousands of
computations and experimental results abound!!)

My plan is to start with QED, then move to
the Electroweak expansion, then QCD and the Standard
Model, and eventually try things like String theory and LQG
(Loop Quantum Gravity).


III. RESEARCH

Up to this point, none of the above could be considered
"true research" -- because it would mostly be
"automation of tedious but *known* work", i.e., a theoretical
or computational physicist with enough time could
calculate both the expressions and the cross sections
for a given setup.

However, the research side of it, once the "engine" is
built, would consist in the exploration of NEW options:

1) "Deep" Physics Patterns

First and foremost, the tool can facilitate finding
"hidden symmetries and NEW patterns", because it can be
used to explore and abstract "rules within the rules"
that would represent higher order "Physics Patterns"
(hence the tool's name!).

Basically, with a tool that is capable of finding
and testing symmetries, it would be *EASY* to find
new symmetries in complex Lagrangians -- this could
in fact lead to the discovery of new conserved
quantities (via Noether's theorem).
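
The connection invoked here is Noether's theorem, which is what turns
a newly found symmetry into a conserved quantity:

  if \mathcal{L} is invariant under \phi \to \phi + \epsilon\,\Delta\phi, then
  j^{\mu} = \frac{\partial \mathcal{L}}{\partial(\partial_{\mu}\phi)} \, \Delta\phi
  satisfies \partial_{\mu} j^{\mu} = 0  (on the equations of motion).

For example, the global U(1) phase symmetry of QED gives conservation
of electric charge; any new symmetry the tool uncovered would come
with its own conserved current in exactly this way.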

In time, the goal is to connect these patterns,
in autocatalytic chains, as described in:
http://www.livingmetaphor.org/pattern-languages-autocatalytic-system.html
to understand the self-organizing life and dance
of Nature.


2) Unified Theories

As higher-order patterns are found, the tool
could also facilitate the exploration of ever more Unified
theories, because the effort of proposing NEW valid
Unified Lagrangians and carrying all the calculations
through to the end would be greatly reduced.


3) Exploration of Quantum Computing

Because the tool accepts generalized propagators,
quantum computing considerations can be built in at
*any* level of structure, for any interacting
particles; therefore the entanglement of *any
particles* (gluons, photons, gravitons, etc.)
could be computed and explored.
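
For reference, the minimal object such explorations would manipulate
is a two-particle entangled state, e.g. the Bell pair

  |\Phi^{+}\rangle = \frac{1}{\sqrt{2}} ( |00\rangle + |11\rangle )

which cannot be factored into a product of one-particle states;
detecting and quantifying exactly this kind of non-factorizability,
for whatever particles the propagators describe, is roughly what
"computing the entanglement" means here.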


4) Physics is Software

In light of the patterns in 1) and the Quantum Computing
explorations in 3), the tool could help provide clues or
explanations to support the worldview that
"Physics is Software": that particles and energy are
equivalent to "data/programs" transformed in a discrete
quantum space-time ("memory/programs") according to
information-related rules. One of the most interesting
questions is "where is the program?" (I have postulated
that it lives both in the evaluating space-time
(memory/evaluator/computer) and in the particle
(data/program), suggesting a Lisp-like environment!!!):
http://www.physicsissoftware.com


5) Variants in the end-to-end process

Because the tool would look at the process end-to-end,
"configurable variants" of the tool could explore
things like:

* looking beyond the "harmonic paradigm" -- universally
accepted from QED to LQG!, but harder to accept
as the "only unique option" in the long term
* new gauge fixes
* avoidance of unnecessary gauge fixes
* new formalisms not dependent on "exponentials of
either Hamiltonians or Lagrangians", but other
functional choices.
* exploration of strong couplings
* powers of space-time derivatives higher than 2, still conforming
to Lorentz invariance or other gauge invariances
* discrete space-time (ala LQG or Matrix String Theory)
* non-commutative geometries


6) Exploration of Dual theories

The tool would make it easier to show that different
theories are equivalent, both in the algebraic
and computational senses. So things like String
theory duality, LQG unifications, and comparisons
with QED, Electroweak, QCD, etc., could be done
at all, or done faster.


7) Nonlinear Optics

(I had to include this one :-)

The tool can also help in the understanding of
Non-Linear Optics, by studying the fundamental
nonlinear scattering processes in greater detail.
(Both Yariv's and Shen's books provide QED-like
analyses that could be further elaborated and
explored -- an area I could never explore when
I was a graduate student because of the complexity,
and hence time requirements, of the subject.)

8) Condensed Matter Physics and Collective Phenomena.

Since condensed matter physics and other collective
phenomena also use a great deal of similar
computational techniques, the tool can also be
used to explore this area in greater detail and
complexity, e.g. Chern-Simons theories, Quantum Hall fluids,
superconductivity, topological field theory,
superfluids, Hawking radiation, critical phenomena,
phonons, solitons, vortices, monopoles, instantons, etc.


IV. APOLOGIA

I know all of the above is quite a mouthful, but keep in mind
that Rome was not built in a day, and also that without a vision
we wouldn't have gone to the Moon :-) ... So, yes,
it is a long-term project. Maybe longer than
my lifetime :-)

I would really appreciate any feedback anyone may have!


- Mike Beedle
http://www.mikebeedle.com

"I am always doing things I can't do, that's how I get to do them."

--Pablo Picasso




