flat assembler
Message board for the users of flat assembler.
> Peripheria > Emergent phenomena 

Tomasz Grysztar
Recently there has been a little buzz around the new paper by Erik Verlinde (excellently reported by Natalie Wolchover) on his theory of emergent gravity, which apparently can take a shot at explaining the phenomena attributed to dark matter in a different (and fascinating) way.
I must admit that I am overly excited by theories like this, since they hit right at my sweet spot. I have been fascinated by the emergence of complex structures from more basic rules ever since I first learned, as a teenager, about Conway's Game of Life and the Mandelbrot set (it is not a coincidence that both can be found among the examples in my assembler packages). These are purely mathematical constructions and I went on to study theoretical mathematics myself, but later I started to appreciate more and more the emergent mechanisms that show up in other disciplines, like physics. I now hold statistical mechanics in high regard, as it shows how things like temperature and thermodynamics naturally emerge from the microscopic structure of matter through statistical regularities. (I once wrote a couple of texts for a friend in which I tried to explain the emergence of temperature and pressure to someone with only a very basic knowledge of physics. I thought I could even share them here, but since they were written in Polish, they would probably end up being read through some kind of automatic translator; after a quick look at the maimed text that Google Translate made out of one of them, I concluded that this is not the right direction.)

We have discussed such topics many times with Ender (usually over a coffee) and we played with the idea that things like gravity, space and time might also be emergent phenomena. But neither of us is proficient enough in physics to actually pursue such ideas on the current frontiers of science. All that is left for us is to get excited when we see the works of people like Verlinde. Even when such ideas fail to provide a complete model of observed reality, they may still give an interesting insight when they demonstrate how some of the regularities found in the real world can be generated from simpler constituents as an emergent feature.
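The Game of Life mentioned above remains the archetypal example of such emergence: from three rules applied to a grid of cells, gliders, oscillators and whole self-sustaining structures simply appear. A minimal Python sketch of it (my own illustration here, not the assembler example from the fasm packages):

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life on an unbounded grid.
    `live` is a set of (x, y) coordinates of the cells that are alive."""
    # Count how many live neighbours each cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell lives in the next generation if it has exactly 3 live
    # neighbours, or 2 live neighbours and was already alive.
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

# A glider: after 4 generations the same shape reappears, shifted by (1, 1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
g = glider
for _ in range(4):
    g = step(g)
assert g == {(x + 1, y + 1) for (x, y) in glider}
```

Nothing in the rules mentions "gliders" or "motion", yet a travelling object emerges; that is the whole point.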
It also hints that the same mathematical structures may show up in a different framework and produce similar regularities in another model. The other day I saw a brilliant MinutePhysics video which uses a well-chosen example to demonstrate how the same mathematical regularities may show up in different models. When there is an idea of how a given complex behavior may emerge from something simpler, it is highly probable that any better model would also contain a similar mathematical emergence in some form. I was never impressed by modified Newtonian dynamics, since it was just trying to find an equation that would produce results consistent with observation (though I understand the mindset that leads to such an approach), but when I see that Verlinde was able to derive the same equation as an emergent feature of some model, I am suddenly fascinated.

The video I brought up above mentions Bohmian mechanics, and this is another topic that recently got my attention, mainly thanks to Veritasium and the "bouncing droplets" experiments that show some interesting analogies with quantum mechanics. The droplets provide only an incomplete analogy for selected quantum phenomena, but it is very refreshing to see how a previously mysterious behavior can be modeled as emerging from something simple. Seeing how a probability density similar to the quantum mechanical one can show up in a completely different setting makes one wonder if there is a mathematical emergence mechanism common to both of them. And full-fledged Bohmian mechanics provides an interpretation of quantum mechanics that is also based on some emergent features. To give predictions consistent with experiments, Bohmian mechanics relies on the so-called "quantum equilibrium", a property of statistical "mixedness" which would be reached naturally through the chaotic motion of the particles, in a process similar to thermodynamics.
This detail really made me pause and think that I should perhaps keep an eye on this theory. Our world is made of incredibly huge numbers of building blocks; perhaps everything that we observe is statistical to the core.

10 Dec 2016, 19:53 

KevinN
Bohm wrote in one book: "Analysis creates the parts."


11 Dec 2016, 00:20 

ender
Quote:
God only knows how much coffee has been poured over these talks! This really is very fascinating stuff. And it keeps me wondering: can any systematic method of finding an "underlying model" for a given emergent behaviour exist? A system, roughly speaking, capable of reducing the pattern of Conus textile to, let's say, a particular cellular automaton. Well, my hunch tells me this thing would be darn uncomputable...
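The forward direction, at least, is easy to run. A sketch of Wolfram's rule 30 (the classic automaton whose output famously resembles the Conus textile shell; my own few lines of Python, for illustration):

```python
# Wolfram's rule 30: each cell's next state is
#   left XOR (centre OR right).
def rule30(cells):
    """One step of the automaton; `cells` is a tuple of 0/1 values,
    wrapped around at the edges."""
    n = len(cells)
    return tuple(cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
                 for i in range(n))

# Start from a single live cell and print a triangle of generations.
row = tuple(1 if i == 40 else 0 for i in range(81))
for _ in range(20):
    print(''.join('#' if c else '.' for c in row))
    row = rule30(row)
```

Going backwards, from the observed pattern to the rule that generates it, is the hard (and, as you suspect, in general undecidable) direction.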

11 Dec 2016, 09:42 

Tomasz Grysztar
A new paper follows that demonstrates at least some agreement between Verlinde's theory and the experimental evidence. This is not much, as it is still applied to just one simple case for which Verlinde provided a mathematical model. For anything more complex, much more complicated models would need to be derived. But perhaps those will come, too.
I also found on the web a nice mathematical explanation of what an entropic force is. This one is able to really stir the imagination. Though the example of the elasticity of polymers in Verlinde's original paper was also very illustrative.
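A toy version of that polymer example can even be computed directly (the model and numbers below are my own illustration, not taken from the paper): a one-dimensional freely jointed chain resists stretching not through any elastic energy, but simply because stretched configurations are exponentially rarer, so entropy S(x) falls as the extension x grows and a restoring force F = T dS/dx appears.

```python
from math import comb, log

# 1D freely jointed chain: N links of unit length, each pointing left
# or right.  The number of configurations with end-to-end extension x
# is a binomial coefficient, so (with k_B = 1) the entropy is
# S(x) = ln omega(x), and the entropic force F = T * dS/dx pulls the
# chain back toward x = 0.  For small x it reduces to Hooke's law,
# F ~ -T * x / N, with no potential energy involved anywhere.

N = 1000                        # number of links (x must share N's parity)

def omega(x):
    # x = (#right - #left) links, so #right = (N + x) / 2
    return comb(N, (N + x) // 2)

def entropic_force(x, T=1.0):
    # F = T * dS/dx, estimated by a central difference (step 2 in x,
    # because only every other extension is reachable).
    return T * (log(omega(x + 2)) - log(omega(x - 2))) / 4

for x in (50, 100, 200):
    print(x, entropic_force(x), -x / N)   # force vs. Hooke's-law estimate
```

Run it and the two columns agree closely for small extensions: a spring made of nothing but counting.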

16 Dec 2016, 20:00 

Furs
I think it's more likely to end up on PBS Space Time, but yeah, it would be lovely, I need visuals/animation to intuitively grasp this stuff...


20 Dec 2016, 12:59 

Tomasz Grysztar
The concept of an elementary particle is definitely a confusing one. QFT, an important part of modern physics, tells us that the fields are fundamental, and particles are just "excited states" of these fields. And I read it as meaning exactly that particles are emergent entities, just like sounds that emerge from the vibrations of an acoustic medium, etc.
What strikes me there is that particles may therefore be much more volatile beings than we usually envision them. This ties into the problem that I always had with the concept of a photon: if it is just a piece of an electromagnetic wave, then multiple photons are going to add together into a more complex, self-interfering wave. How do we know how many photons are then "hidden" in this wave? The only way we could count them is by interacting with that wave, one photon at a time, extracting portions of energy until none is left. But wouldn't that mean that it is a bit of an arbitrary process, where it is the character of the interactions that decides what kind of photons we extract and how many of them there were? When I say it, with my limited expertise (I'm a mathematician, not a physicist), it may perhaps appear naive, but I've found that I'm not the only one asking such questions and that this is still something that might be worth looking into.

When we start thinking of a particle not as something existing unequivocally in the underlying fields, but as their emergent property, then it is the interaction (a measurement, one might say) with the field that "produces" a particle, in the sense that the particle is a pattern in the field that is relevant for a specific outcome, and various interactions could extract different things from the same-looking source. If this sounds like it is related to the measurement problem, it's because it probably is. If a detection is simply an interaction with a portion of the field and a particle is just a pattern in the field that ends up involved in said interaction, then the probability of detecting a particle at some position, given by quantum theories, could be understood quite literally: as nothing more than the probability of producing a specific outcome. So you could say that all there is are interfering fields, and "particles" are just results of experiments that we perform.
Everything about these fields could be continuously distributed in space, and the only thing that is discrete (or "quantized") would be the interactions. Each interaction is an event, so we can count events (for example, blips on the screen) and treat them as "particles", but that does not have to mean that the fields themselves consist of individual entities.

What I realized only recently is that when we start considering particles as "produced" at the time of detection, it may shine a new light* on the experiments breaking the Bell inequalities (and similar ones, like CHSH). Because if we consider that the properties of a particle produced as a result of an experiment could depend not only on the source fields, but also to some degree on the state of the detector, then the excess correlations in such experiments are no longer so mysterious. I've been coding my own scripts demonstrating how it is possible to break the CHSH inequality with detectors that are only able to produce results (that is: particles) when there is enough resonance in the underlying field to count as one; and then I found out that it has already been done in a more serious effort by Andrei Khrennikov. I also found an amazing article by Sergey A. Rashkovskiy, which even provides a viable theory of classical fields producing the illusion of particles. Moreover, the same physicist wrote a series of articles (the second part is behind a paywall, so some of you may not be able to read it) demonstrating that many of the famous mysteries of quantum mechanics could become less puzzling if we got rid of the concept of a "particle" existing as anything other than just an event (which we consider the result of a measurement). It goes as far as providing a classical field-based analog of quantum cryptography, and it is a truly fascinating thing to read. All this made me a bit more confident that it is not just my misunderstanding of fundamental concepts and that the issues I raised could be valid.
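A stripped-down sketch of what such a script can look like (this toy reconstruction is mine; the threshold rule and angles are illustrative, not Khrennikov's actual model). Two detectors share a hidden polarization angle, each produces an outcome only when the field projection on its analyzer is strong enough to "count as a particle", and the CHSH quantity S = E(a,b) + E(a',b) + E(a',b') - E(a,b') is then computed over coincidences only, which is exactly where the local bound of 2 stops applying:

```python
import math
import random

ETA = math.cos(math.pi / 4)   # "resonance" threshold of the detectors

def measure(angle, lam):
    """Outcome +1/-1, or None when the field is below threshold
    and the detector produces no particle at all."""
    c = math.cos(2 * (angle - lam))
    return None if abs(c) < ETA else (1 if c > 0 else -1)

def correlation(a, b, trials=100_000):
    """E(a, b) estimated over coincidences only (both detectors fired)."""
    total, count = 0, 0
    for _ in range(trials):
        lam = random.uniform(0, math.pi)       # shared local hidden variable
        sa, sb = measure(a, lam), measure(b, lam)
        if sa is not None and sb is not None:  # post-select coincidences
            total += sa * sb
            count += 1
    return total / count

a, ap = 0.0, math.pi / 4                  # Alice's settings: 0 and 45 degrees
b, bp = math.pi / 8, 3 * math.pi / 8      # Bob's: 22.5 and 67.5 degrees
S = (correlation(a, b) + correlation(ap, b)
     + correlation(ap, bp) - correlation(a, bp))
print("CHSH S =", S)   # local bound is 2; quantum bound is 2*sqrt(2)
```

The model is perfectly local; it is only the discarded non-coincidences (the detection loophole) that let S climb past 2.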
Perhaps the particles do not really exist as well-defined entities travelling through space. Perhaps photons and phonons have more in common than is usually admitted, both being just emergent phenomena in our interpretation of interactions. I've heard several times that there has been very little actual progress in the foundations of physics in recent decades, while other areas, like condensed matter physics, continue to advance. Maybe it's not a coincidence, and actually a good thing: if all the things we know are emergent phenomena of interacting fields, we may need all these new tools, and more, to even have a shot at tackling the foundational problems. And we may still have a long way to go.

________
* Pun intended. I do intend my puns, please forgive the bad ones.

25 Nov 2021, 20:41 

sylware
I really like your way of "seeing"/"wording" the quantum fields.
It made me think of how the "movement in space and time of 2 interacting particles" (for instance 2 electrons) is predicted from interaction events: it is the integration of all interactions over time at each location. For 2 electrons, they "repel" each other, and you can see the point of highest probability to measure the locations of the electrons moving further apart. Well... this is my understanding, from very, very far "away", of what I saw on the internet. Physicists are kind of stuck: it is now really hard to look for new physics (general relativity and the standard model are too "good").

26 Nov 2021, 14:41 


Copyright © 1999-2020, Tomasz Grysztar. Also on GitHub, YouTube, Twitter.
Website powered by rwasa.