flat assembler
Message board for the users of flat assembler.

Index > Peripheria > Emergent phenomena

Tomasz Grysztar



Joined: 16 Jun 2003
Posts: 8349
Location: Kraków, Poland
Tomasz Grysztar 10 Dec 2016, 19:53
Recently there has been a little buzz around a new paper by Erik Verlinde (excellently reported by Natalie Wolchover) on his theory of emergent gravity, which apparently can take a shot at explaining the phenomena attributed to dark matter in a different (and fascinating) way.

I must admit that I am overly excited by theories like this, since they hit right at my sweet spot. I have been fascinated by the emergence of complex structures from more basic rules ever since I first learned, as a teenager, about Conway's Game of Life and the Mandelbrot set (it is not a coincidence that both can be found among the examples in my assembler packages). These are purely mathematical constructions, and I went on to study theoretical mathematics myself, but later I started to appreciate more and more the emergent mechanisms that show up in other disciplines, like physics. I now hold statistical mechanics in high regard, as it shows how things like temperature and thermodynamics naturally emerge from the microscopic structure of matter through statistical regularities.
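
Just to show how little it takes, here is a minimal sketch of the Life rules (my own quick Python illustration, not the assembly example shipped with fasm):

Code:

import numpy as np

def life_step(grid):
    """One generation of Conway's Game of Life on a wrap-around grid:
    a live cell survives with 2 or 3 live neighbours, a dead cell comes
    alive with exactly 3 - and that is the entire rule set."""
    neighbours = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(np.uint8)

# A glider: five live cells whose pattern travels diagonally across the board,
# a "creature" that is nowhere to be found in the two rules above.
grid = np.zeros((10, 10), dtype=np.uint8)
for y, x in ((1, 2), (2, 3), (3, 1), (3, 2), (3, 3)):
    grid[y, x] = 1

for _ in range(4):   # after 4 generations the glider reappears shifted by one cell diagonally
    grid = life_step(grid)
print(grid)

The rules say nothing about gliders, oscillators or self-copying patterns, yet all of them show up - and that is exactly the kind of emergence I mean.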

Ender and I have discussed such topics many times (usually over coffee) and played with the idea that things like gravity, space, and time might also be emergent phenomena. But neither of us is proficient enough in physics to actually pursue such ideas at the current frontiers of science. All we are left with, then, is to get excited when we see the work of people like Verlinde.

Even when such ideas fail to provide a complete model of observed reality, they may still give an interesting insight when they demonstrate how some of the regularities found in the real world can be generated from simpler constituents as an emergent feature. It then hints that the same mathematical structures may also show up in a different framework and produce similar regularities in another model.

The other day I saw a brilliant MinutePhysics video which uses a well-chosen example to demonstrate how the same mathematical regularities may show up in different models. When there is an idea of how a given complex behavior may emerge from something simpler, it is highly probable that any better model would also contain a similar mathematical emergence in some form. I was never impressed by modified Newtonian dynamics, since it was just trying to find an equation that would produce results consistent with observation (though I understand the mindset that leads to such an approach) - but when I see that Verlinde was able to derive the same equation as an emergent feature of some model, I am suddenly fascinated.
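
For context, this is roughly the regularity that equation captures (my own toy Python illustration with made-up round numbers - nothing here is taken from Verlinde's derivation): in the deep-MOND regime the acceleration follows a = sqrt(a_N * a0), and for a point mass that makes the predicted rotation velocity stop falling with radius, which is what galaxy rotation curves famously show.

Code:

import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 1.0e41         # a galaxy-scale mass in kg (toy value, roughly 5e10 solar masses)
a0 = 1.2e-10       # the MOND acceleration scale, m/s^2

print("  r [kpc]   v_Newton [km/s]   v_deep_MOND [km/s]")
for r_kpc in (5, 10, 20, 40, 80):
    r = r_kpc * 3.086e19                    # kpc -> metres
    v_newton = math.sqrt(G * M / r)         # v^2/r = GM/r^2: velocity keeps dropping with r
    v_mond = (G * M * a0) ** 0.25           # v^2/r = sqrt((GM/r^2) * a0): r cancels, flat curve
    print(f"{r_kpc:9d}   {v_newton / 1e3:15.1f}   {v_mond / 1e3:18.1f}")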

The video I brought up above mentions Bohmian mechanics, and this is another topic that recently got my attention, mainly thanks to Veritasium and the "bouncing droplets" experiments that show some interesting analogies with quantum mechanics. The droplets provide only an incomplete analogy for selected quantum phenomena, but it is very refreshing to see how a previously mysterious behavior can be modeled as emerging from something simple. Seeing how a probability density similar to the quantum mechanical one can show up in a completely different setting makes one wonder if there is a mathematical emergence mechanism common to both of them. And full-fledged Bohmian mechanics provides an interpretation of quantum mechanics that is also based on some emergent features. To give predictions consistent with experiments, Bohmian mechanics relies on the so-called "quantum equilibrium", a property of statistical "mixedness" that would be reached naturally through the chaotic motion of the particles, in a process similar to thermodynamics. This detail really made me pause and think that I should perhaps keep an eye on this theory.

Our world is made from an incredibly huge number of building blocks; perhaps everything that we observe is statistical to the core.
KevinN



Joined: 09 Oct 2012
Posts: 160
KevinN 11 Dec 2016, 00:20
Bohm wrote in one book: "Analysis creates the parts."
ender



Joined: 03 Nov 2004
Posts: 11
Location: London, UK
ender 11 Dec 2016, 09:42
Quote:

Ender and I have discussed such topics many times (usually over coffee)

God only knows how much coffee has been poured over these talks! Wink
This really is fascinating stuff. And it keeps me wondering: can any systematic method of finding an "underlying model" for a given emergent behaviour exist?
A system, roughly speaking, capable of reducing the pattern of Conus textile to, let's say, a particular cellular automaton.
Well, my hunch tells me this thing would be darn uncomputable... Smile
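
The forward direction, at least, is almost trivially cheap - here is a minimal Python sketch (my own illustration) of Rule 30, the elementary cellular automaton whose output is usually compared to the Conus textile pattern. It is the inverse search, from pattern back to rule, that smells uncomputable.

Code:

# Rule 30: each cell's next state depends only on itself and its two neighbours,
# via the 8-entry lookup table encoded in the binary digits of the number 30.
width, steps, rule = 79, 40, 30
table = {(a, b, c): (rule >> (a * 4 + b * 2 + c)) & 1
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}

row = [0] * width
row[width // 2] = 1                        # a single live cell as the seed

for _ in range(steps):
    print(''.join('#' if cell else ' ' for cell in row))
    row = [table[(row[i - 1], row[i], row[(i + 1) % width])]   # wrap-around edges
           for i in range(width)]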
Tomasz Grysztar



Joined: 16 Jun 2003
Posts: 8349
Location: Kraków, Poland
Tomasz Grysztar 16 Dec 2016, 20:00
A new paper has followed that demonstrates at least some agreement between Verlinde's theory and the experimental evidence. This is not much yet, as it applies to just one simple case for which Verlinde provided a mathematical model. For anything more complex, much more complicated models would need to be derived. But perhaps those will also come.

I also found on the web a nice mathematical explanation of what an entropic force is. This one is really able to stir the imagination, though the example of the elasticity of polymers in Verlinde's original paper was also very illustrative.
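
To make that example concrete, here is a tiny toy version of the polymer argument (my own sketch of the standard freely-jointed chain reduced to a 1D random walk, not necessarily the exact setup from the paper or from the page I linked): the chain pulls its ends back together simply because there are vastly more crumpled configurations than stretched ones, and the resulting tension comes out proportional to temperature and extension - a spring made of nothing but counting.

Code:

import math

def entropy(N, x):
    """ln Omega(x): log-count of N-segment chains (1D unit-step random walks)
    whose end-to-end extension is x segments."""
    return (math.lgamma(N + 1)
            - math.lgamma((N + x) / 2 + 1)
            - math.lgamma((N - x) / 2 + 1))

N = 1000     # number of chain segments
kT = 1.0     # temperature, with k_B = 1

print("   x    entropic tension    kT*x/N (Hooke-like prediction)")
for x in range(0, 201, 40):
    # Force needed to hold the ends at extension x: f = -T * dS/dx, estimated
    # with a centered finite difference. There is no energy term anywhere;
    # the pull comes purely from the number of available configurations.
    f = -kT * (entropy(N, x + 1) - entropy(N, x - 1)) / 2
    print(f"{x:4d}    {f:16.6f}    {kT * x / N:10.6f}")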
Enko



Joined: 03 Apr 2007
Posts: 676
Location: Mar del Plata
Enko 19 Dec 2016, 21:50
Thanks for sharing the links, Tomasz! Very interesting info.
I would love for somebody like Veritasium, MinutePhysics, or VSauce to make a video about this; otherwise I don't think I will really be able to fully comprehend the fine details of this theory.
Furs



Joined: 04 Mar 2016
Posts: 2493
Furs 20 Dec 2016, 12:59
I think it's more likely to end up on PBS Space Time, but yeah, it would be lovely - I need visuals/animation to build an intuition for this stuff...
Tomasz Grysztar



Joined: 16 Jun 2003
Posts: 8349
Location: Kraków, Poland
Tomasz Grysztar 25 Nov 2021, 20:41
The concept of an elementary particle is definitely a confusing one. QFT, an important part of modern physics, tells us that the fields are fundamental, and particles are just "excited states" of these fields. I read this as meaning exactly that particles are emergent entities, just like sounds that emerge from the vibrations of an acoustic medium, etc.

What strikes me there is that particles may therefore be much more volatile beings than we usually envision.

This ties into the problem that I have always had with the concept of a photon: if it is just a piece of an electromagnetic wave, then multiple photons are going to add together into a more complex, self-interfering wave. How do we know how many photons are then "hidden" in this wave? The only way we could count them is by interacting with that wave, one photon at a time, extracting portions of energy until none is left. But wouldn't that mean that it's a bit of an arbitrary process, where it's the character of the interactions that decides what kind of photons we extracted and how many of them there were? Coming from me, with my limited expertise (I'm a mathematician, not a physicist), this may perhaps appear naive, but I've found that I'm not the only one asking such questions and that this is still something that might be worth looking into.

When we start thinking of a particle not as something existing unequivocally in the underlying fields, but as their emergent property, then it's the interaction with the field (a measurement, one might say) that "produces" a particle, in the sense that the particle is a pattern in the field that's relevant for a specific outcome, and various interactions could extract different things from the same-looking source. If this sounds like it is related to the measurement problem, it's because it probably is. If a detection is simply an interaction with a portion of the field, and a particle is just a pattern in the field that ends up involved in said interaction, then the probability of detecting a particle at some position, as given by quantum theories, could be understood quite literally - as nothing more than the probability of producing a specific outcome.

So you could say that all there is are interfering fields, and "particles" are just results of the experiments that we perform. Everything about these fields could be continuously distributed in space, and the only thing that is discrete (or "quantized") would be the interactions. Each interaction is an event, so we can count events (for example, blips on a screen) and treat them as "particles", but that does not have to mean that the fields themselves consist of individual entities.

What I realized only recently is that when we start considering particles as "produced" at the time of detection, it may shine a new light* on the experiments breaking the Bell inequalities (and similar ones, like CHSH). Because if we consider that the properties of a particle produced as a result of an experiment could depend not only on the source fields, but also to some degree on the state of the detector, then the excess correlations in such experiments are no longer so mysterious.

I've been coding my own scripts demonstrating how it is possible to break the CHSH inequality with detectors that are only able to produce results (that is: particles) when there is enough resonance in the underlying field to count as one - and then I found out that it has already been done in a more serious effort by Andrei Khrennikov. I also found an amazing article by Sergey A. Rashkovskiy, which even provides a viable theory of classical fields producing the illusion of particles. Moreover, the same physicist wrote a series of articles (the second part is behind a paywall, so some of you may not be able to read it) demonstrating that many of the famous mysteries of quantum mechanics could become less puzzling if we got rid of the concept of a "particle" existing as anything other than just an event (which we consider the result of a measurement). It goes as far as providing a classical field-based analog of quantum cryptography, and it is a truly fascinating thing to read.
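
To give a flavour of how little such a script needs (this is a minimal sketch of the general idea, not my actual scripts and not Khrennikov's model - the "resonance" rule and the threshold value here are arbitrary choices): a shared hidden phase is set at the source, each side computes its outcome locally, and a detector only "clicks" when its local resonance clears a threshold. Post-selecting on coincidences is then enough to push the CHSH value above 2, even though everything underneath is local and deterministic.

Code:

import numpy as np

rng = np.random.default_rng(1)

def measure(lam, angle, tau):
    """A purely local 'detector': the outcome is fixed by the hidden phase lam,
    but the detector only clicks when the resonance |cos(lam - angle)| >= tau."""
    s = np.cos(lam - angle)
    return np.sign(s), np.abs(s) >= tau

def correlation(a, b, tau, n=200_000):
    """E(a, b) estimated over coincidences only (both detectors clicked)."""
    lam = rng.uniform(0.0, 2.0 * np.pi, n)   # shared hidden variable, fixed at the source
    A, click_a = measure(lam, a, tau)
    B, click_b = measure(lam, b, tau)
    both = click_a & click_b
    return float(np.mean(A[both] * B[both]))

a1, a2 = 0.0, np.pi / 4                      # Alice's two settings
b1, b2 = np.pi / 8, 3 * np.pi / 8            # Bob's two settings

for tau in (0.0, 0.5):
    S = (correlation(a1, b1, tau) + correlation(a2, b1, tau)
         + correlation(a2, b2, tau) - correlation(a1, b2, tau))
    print(f"tau = {tau:.1f}   post-selected CHSH S = {S:.3f}")
# tau = 0.0: every event is detected, S comes out at about 2.0 (the classical bound)
# tau = 0.5: only "resonant" events are detected, S comes out at about 2.25 > 2,
#            purely from the coincidence/selection bias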

All this made me a bit more confident that it is not just my misunderstanding of fundamental concepts and that the issues I raised could be valid. Perhaps particles do not really exist as well-defined entities travelling through space. Perhaps photons and phonons have more in common than is usually admitted, both being just emergent phenomena in our interpretation of interactions.

I've heard several times that there has been very little actual progress in the foundations of physics in recent decades, while other areas, like condensed matter physics, continue to advance. Maybe that is not a coincidence, and actually a good thing. If all the things we know are emergent phenomena of interacting fields, we may need all these new tools, and more, to even have a shot at tackling the foundational problems. And we may still have a long way to go.

________
* Pun intended. I do intend my puns, please forgive the bad ones.
sylware



Joined: 23 Oct 2020
Posts: 437
Location: Marseille/France
sylware 26 Nov 2021, 14:41
I really like your way of "seeing"/"wording" the quantum fields.
It made me think of how the "movement in space and time of two interacting particles" (for instance two electrons) is predicted from interaction events: it is the integration of all the interactions over time at a given location. Two electrons "repel" each other, and you can see the points of highest probability for measuring the electrons' locations moving further apart. Well... this is my understanding, from very, very far "away", of what I saw on the internet.

Physicists are kind of stuck: it is now really hard to look for new physics (general relativity and the standard model are too "good").
Tomasz Grysztar



Joined: 16 Jun 2003
Posts: 8349
Location: Kraków, Poland
Tomasz Grysztar 18 Dec 2021, 14:13
This new video adds the voice of another specialist, confirming my own suspicion that all of the alleged weirdness of quantum mechanics disappears once it is accepted that the entire behavior of a particle depends on the "measurement" that is finally performed. And this kind of "superdeterminism" is exactly what the conclusion would be if it turned out that particles were not independent entities, but were defined by the detection process - a particle seen by a measurement being simply a pattern in the field that ended up involved. With such an interpretation it becomes understandable how all kinds of statistical selection bias could be introduced by the measurement process, so I have a strong feeling that it all adds up. Even if in reality it may not all be as simple as I naively hope, I'm convinced that the popular interpretations of fundamental physics are misleading, to say the least.
sylware



Joined: 23 Oct 2020
Posts: 437
Location: Marseille/France
sylware 20 Dec 2021, 23:07
The "spooky" action at a distance: "independance in space/time" of a significant change to a experiment (turning on a detector) with its outcome.


AsmGuru62



Joined: 28 Jan 2004
Posts: 1617
Location: Toronto, Canada
AsmGuru62 21 Dec 2021, 15:24
"Measurement changes the environment" -- Very good observation. It is like when you have a crash in multi-threaded code. You want to log what each thread is doing and after logs are made -- that crash (or a deadlock) disappears. Of course it does! -- Because the logging introduced the timing changes in the code!
Tomasz Grysztar



Joined: 16 Jun 2003
Posts: 8349
Location: Kraków, Poland
Tomasz Grysztar 21 Dec 2021, 16:32
What I'm considering here is trickier than simply "affecting the environment"; it's more of an "in the eye of the beholder" thing. My point is: if a detector can only interact with specific patterns (interpreted as "particles"), it may see the same environment differently than another device would in its place. In an extreme case, two different types of measurements could "see" different numbers of particles in the same underlying composite wave (I guess this might also break the "realism" assumption of Bell's theorem). As a consequence, each detector would be introducing its own set of biases into what it sees, and that would disrupt the statistics, creating weird correlations - which is exactly what we see in quantum experiments.

What this view has in common with "superdeterminism" (defined as in the linked material) is that if we look at the pattern corresponding to a detection event and trace back its history, we can interpret that as the history of a particle - but this would be an after-the-fact interpretation of some waving patterns in the field, and the interpretation could be different if we started with a different future event. So the properties of a particle truly depend on how it is detected in the future, but only because this affects which patterns in the field we interpret as corresponding to the particle. The underlying state of the field, full of composite waves, would itself not depend on the future; only our interpretation of it as containing such-and-such particles would change accordingly.
Tomasz Grysztar



Joined: 16 Jun 2003
Posts: 8349
Location: Kraków, Poland
Tomasz Grysztar 26 Dec 2021, 14:06
In other words, it's like seeing different shapes in a cloud. While the cloud itself remains the same, everyone might be seeing something different, depending on what they focus on, and in some sense everyone could be right. Remember the Yanny or Laurel illusion?

This makes the "observer effect" reach further than just a simple disturbance of the measured object caused by interacting with it. When we take what we observed and trace it back in time, the context makes us focus on specific patterns in the underlying turbulent "cloud" - we can discern the trajectory of a particle just as we recognize a familiar shape against a noisy background, but this is only our abstract thinking and it does not affect the cloud itself. And that's how the setting of a measurement may affect the properties of a particle in the past, even though in reality it does not change anything.
Tomasz Grysztar



Joined: 16 Jun 2003
Posts: 8349
Location: Kraków, Poland
Tomasz Grysztar 27 Dec 2021, 13:30
The basis for this idea is already lurking in modern physics: a coherent state is a wave-like phenomenon that may not contain a fixed number of particles, and its composition becomes defined by the interactions that ultimately happen to it. And even if the wave does contain a fixed number of entities, the details of its decomposition can only be derived once we know the results of measurements.
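
A tiny illustration of that first point (my own, just the textbook formula): a coherent state with amplitude alpha has no definite photon number at all - the probability of registering n photons follows a Poisson distribution with mean |alpha|^2.

Code:

import math

alpha_sq = 4.0   # |alpha|^2, the mean photon number of the coherent state

# Photon-number distribution of a coherent state: P(n) = exp(-|a|^2) * |a|^(2n) / n!
print(" n   P(n)")
for n in range(13):
    p = math.exp(-alpha_sq) * alpha_sq ** n / math.factorial(n)
    print(f"{n:2d}   {p:.4f}")
# The probability is spread over many values of n: asking "how many photons are
# in this wave?" has no sharp answer until an interaction actually picks one outcome.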

My supposition could then be summarized as applying similar reasoning to any complex state of the world, which is like a sum of tremendously many interfering waves. Such a state could be mathematically decomposed into wave patterns corresponding to particles, but such a decomposition would not necessarily be unique. My point is that the presence of particles with such-and-such properties may be an artifact of interpretation, a post-rationalization of the events that we observe.

I'm surprised that such a simple idea would be able to tame all the weirdness associated with quantum mechanics, but seemingly it does. The properties of particles appear to be affected by how we measure them in the future, because they actually are - the context dictates which emergent entities in the underlying field we interpret as the thing that we later detected. A more fundamental, deterministic theory of these waves may be possible after all - and it would be challenging only because of mathematical and computational complexity, not because of some innate strangeness.
OmegaSourcerer



Joined: 24 Mar 2014
Posts: 2
OmegaSourcerer 09 Apr 2022, 18:54
I think anyone reading this thread will really enjoy this...

The whole video is great, but especially from 4:00 to 10:00: Conway's Game of Life literally brings itself to life. At 9:30, let me know if the hair-raising epiphany sets in. 8D
This is a wonderful visual of how an entire emergent system can literally be birthed from only a few rules.

https://www.youtube.com/watch?v=6avJHaC3C2U

There is a section in this video on the Mandelbrot set as well, but the Conway section specifically was incredible. Enjoy!