Saturday, February 20, 2010

Lovely pictures of Quarks and Gluons

The next two pictures of an electron striking a deuterium nucleus are part of an animation sequence from Jefferson Lab, which you can see in full by clicking here.



Added since first posting:

By Jérôme :

My goodness, that looks like a man! Spooky, yet just a coincidence, I'm sure.


Jérôme CHAUVET said...

It looks like a video game. I feel like grabbing my joystick and shooting these quarks up :)

Steven Colyer said...

I love the way the electron is shown as a white bullet. A "smear," or if you speak Yiddish, a "schmear." Nobody knows what these things actually look like, so they're all computer animations, but ... it's wonderful what computers are capable of these days, and they're only getting better.

What do you think of Lattice Gauge Theory? There's another branch of physics, along with non-linear dynamics, where supercomputers are an indispensable tool.

Jérôme CHAUVET said...

The kind of model shown there looks more plausible to me than the ones you present in your post, but less fun for the eyes.

A lattice is a scaffold in the void for the physical phenomenon under consideration, which fits the definition of an aether. On the one hand, those models may appear to be progress, since they require supercomputers; on the other hand, the need for an aether is a step backward. Isn't it?

Steven Colyer said...

Well, it's not just the aether, but also the difficulty in Lattice QCD: the computational cost grows rapidly the smaller you make the lattice spacing. There is much work to be done in the field; it is far from complete.

I bet the field sucks up quite a few government dollars though, yes?

I have added a picture from your Fermilab link, thank you for that.

Jérôme CHAUVET said...

I bet the field sucks up quite a few government dollars though, yes?

In fact, I would have thought the contrary. Simulated natural systems are supposed to reduce the huge cost of experimental science, and are believed to justify the money spent so far on pure theory. In my opinion, such research tends to attract better funding, though I do not actually know whether this holds in physics.

In the Biochemistry field, simulation is becoming the standard. I don't see why it shouldn't be the same in Physics.


Steven Colyer said...

Everything costs less than a particle accelerator, to be sure.

But the supercomputer researchers draw salaries and require insurance; the computers they use are state of the art, and such machines aren't cheap; and then there's the cost of climate control.

I'm always reminded of a bygone age, when Einstein was asked what he required on his first day at IAS in Princeton. It was something to the effect of "Four things: pencils, paper, a desk or table upon which to write, and a wastepaper basket in which to file my mistakes."

Jérôme CHAUVET said...

Model computing will surely not kill theoretical discovery. Computers are merely another form of experiment: they do not provide real theoretical results; they only mimic the phenomenon in a numerical environment by applying previously discovered theoretical principles. Not a single computer has discovered a principle yet, which is still the very job of the human brain.

Back in Einstein's time, people computed functions with pen and paper from the equations they had mentally and philosophically invented. Now, in the present time, using computers is the same as having automated pens and larger sheets of paper, but nothing has changed :)


Steven Colyer said...

Wonderful reply. Yes, I've thought of all of that. Here is my response, which hopefully will add to the discussion:

When I was in my undergrad days in the late 1970's, we took this wonderful course called "Computer Science." It was brand new at the undergraduate engineering level. We would sit at these IBM punch card machines, submit our "batch" of punch cards to the computer geeks behind the counter, and half an hour later would get a printout. Hopefully, we were careful and didn't type a comma where a period was called for in our 35-line programs, because if so we got garbage. We would then have to "de-bug", a new word then, fix, re-submit, wait another half hour, and pray we didn't screw it up a second time.

If we did ... repeat iteration until you get it right. :-)

I don't look at Mathematics as a "Science," Jérôme, for to do so is to insult Mathematics. It is far more important than that ... it is a "language." Indeed, the ultimate one.

Logic, which is Plato's (the blogger's) and Phil Warnell's field of study, is the ultimate and original field of study, in my opinion. Out of Logic grew Mathematics in one direction, and Science in another. It would take many dark ages, past Humanism and through the Enlightenment, for Newton, Descartes, and many others to re-combine the two.

But Computer Science is and always shall be "Logic"-based. It's the right field that came along at the right time. These truly are wonderful times we live in.

So, I am essentially agreeing with you, just giving a different take on it.

YES, computers are no more than tools. They are idiot savants, like Dustin Hoffman's character in "Rain Man." Low IQ, fast processing speed. They thus pair well with humans, who are the opposite (in comparison).

Jérôme CHAUVET said...

Nothing to add, well done!

Plato said...

"which is Plato's (the blogger) field of study....." should only be taken to mean that I recognize the value of logic in terms of computerization.

One of my interests, and no qualifications.

I mean, sure, I would have liked to see the subjective qualities of psychological models, so that Venn logic and a transactional analysis could have been given some footing, but alas it will always remain subjective on the one hand, while steering toward an understanding of issues of entanglement "in one form or another."

We've just started here. Quantum Chlorophyll

As Jérôme said, and I agree, you need the human mind in order to make the breakthrough.

Computerization is indeed modeling, and going into theoretical areas (a 5d perspective), married with equations transformed into graphics, does wonders to help us understand the theoretical perspective and movement in dynamical fields.


Jérôme CHAUVET said...

Hi Plato,

Thank you for joining us.

The fact is that computers do not sense what would be worth discovering, as they have no interest in anything. It's up to you to program them, and as far as I know no self-programming computer has been invented. Once programmed, a computer performs its tasks in loops, and unlike us it does not seem to get bored with that. When trapped in a loop, the human brain seems to creatively find a way to escape from it, or gets depressed otherwise. I think this is what characterizes our intelligence.

But as a biochemist, I also know that our brain has computing abilities, and it is this ability which is used by Dr. Fromherz to interface neurons with electrical systems.

What is then the very difference between the human mind and computers?

It seems like it would be plasticity, i.e., the capacity of our brain to re-program itself, which allows our ideas to adapt to the shifting paradigms of the world. When we are able to invent a computer endowed with plasticity, will we get a thinking computer? One able to discover scientific principles?

It is not impossible that a next generation of computers will kill our argument.


Jérôme CHAUVET said...

The broken link for Fromherz's site is now here

Plato said...

Jérôme:When trapped into a loop, the human brain seems to creatively find a way to escape from it, or to get depressed otherwise. I think this is what characterizes our intelligence.

Yes, I agree with this too. It is interesting for me to see this development of the brain in relation to "computer development," alongside the idea of "increasing intelligence and creativity." How is this possible unless its range of alternatives can be increased as probable outcomes from which it could choose? A very large database.

Coxeter's quote in terms of being a Platonist applies here?

Plasticity? I am not sure I understand this. Is it like a scintillator?

In Pioneering Study, Monkey Think, Robot Do By SANDRA BLAKESLEE,

In previous experiments, some in the same laboratory at Duke, both humans and monkeys have had their brains wired so they could move cursors on computer screens just by thinking about it. And wired monkeys have moved robot arms by making a motion with their own arms. The new research, however, involves thought-controlled robotic action that does not depend on physical movement by the monkey and that involves the complex muscular activities of reaching and grasping.


Monkey Moves Computer Cursor by Thoughts Alone, By E.J. Mundell

Going one step further, her team then trained the monkey to simply think about a movement, without reaching out and touching the screen. A computer program, hooked up to the implanted electrodes, interpreted the monkey's thoughts by tracking flare-ups of brain cell activity. The computer then moved a cursor on the computer screen in accordance with the monkey's desires--left or right, up or down, wherever ``the electrical (brain) pattern tells us the monkey is planning to reach,'' according to Meeker.


Jérôme CHAUVET said...

Plasticity? I am not sure I understand this.

Neuroplasticity is the most essential concept to know about the brain of living entities.

When comparing the computer and the brain, plasticity appears as the very difference between them. It means that our neurons are endowed with the ability to weaken their activity when they are involved in circuitry that is not very useful (that is, not frequently used), and to strengthen their activity whenever they belong to circuitry frequently used for a certain mental activity. This is, stricto sensu, the ability to self-reprogram thoughts, which is something a computer lacks.

Neurobiologists think that this explains our capacity to invent: by reconnecting our neurons, we are able to see things in another way.
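[Editor's note: the strengthen-when-used, weaken-when-idle rule described above can be sketched in a few lines of code. This is only an illustrative toy, not any real neural model; the function name, learning rate, and decay rate are all made up for the example.]

```python
# Toy sketch of Hebbian-style plasticity: connections in frequently
# used circuits are strengthened, idle ones gradually decay.
# All names and constants here are hypothetical, for illustration only.

def update_weight(w, used, rate=0.1, decay=0.05):
    """Strengthen a connection when its circuit fires; weaken it otherwise."""
    if used:
        return w + rate * (1.0 - w)   # saturating growth toward 1
    return w * (1.0 - decay)          # gradual weakening of an unused link

# A frequently used connection grows stronger...
w = 0.5
for _ in range(20):
    w = update_weight(w, used=True)

# ...while a rarely used one fades.
v = 0.5
for _ in range(20):
    v = update_weight(v, used=False)
```

After twenty rounds, the active connection's weight has climbed well above its starting value while the idle one has decayed toward zero, which is the "self-reprogramming" asymmetry the comment describes.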


Steven Colyer said...

While I find this discussion very fascinating, I'm not sure it belongs on a page about pictures of quarks and gluons.

However I just made a new blog page in Plato and Jérôme's honor and in honor of this cutting edge subject: here.

The lesson of course being, if you work the logic long enough, Jérôme, you will eventually be compared to Plato. :-)