flat assembler
Message board for the users of flat assembler.
> Main > problem with sign
Tomasz Grysztar 30 Oct 2013, 13:50
revolution wrote: Indeed PI was proved irrational a long time ago (...)
revolution 30 Oct 2013, 14:45
Tomasz Grysztar wrote:
Tomasz Grysztar 30 Oct 2013, 15:14
revolution wrote: Okay, seems that we need to normalise it then. Just to be sure.
revolution 30 Oct 2013, 15:24
Tomasz Grysztar wrote: This may not be easy if one wanted to have a proven result. For example the binary Liouville number 0.110001000000000000000001... is irrational and transcendental, like Pi - however it would not be easy to normalize it. For example the "Von Neumann whitening" would produce a sequence of identical bits out of it.
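Tomasz's point about the Liouville number can be checked directly. A minimal sketch, not from the thread: it assumes the classic Von Neumann extractor (pair up bits; 01 yields 0, 10 yields 1, 00 and 11 are discarded) applied to the binary Liouville number, whose bit at position k is 1 exactly when k is a factorial:

```python
def liouville_bits(n):
    # Binary Liouville number: bit at position k (1-based) is 1 iff k = m! for some m >= 1.
    facts = set()
    f, m = 1, 1
    while f <= n:
        facts.add(f)
        m += 1
        f *= m
    return [1 if k in facts else 0 for k in range(1, n + 1)]

def von_neumann(bits):
    # Classic Von Neumann extractor: 01 -> 0, 10 -> 1, 00/11 discarded.
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

print(von_neumann(liouville_bits(10000)))  # → [0, 0, 0, 0, 0]
```

The first pair (1, 1) is discarded, and every later factorial m! (m >= 3) is even, so it always lands as the second bit of a (0, 1) pair: the extractor emits only zeros, exactly the "sequence of identical bits" Tomasz describes.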
hopcode 01 Nov 2013, 18:11
tips or tricks: it's something like an LCG, and what I like most:
it is reversible (after tweaking it a bit) and it is multiplicative. The results seem promising, but the theory is not yet implemented the way I mean it, and it is very slow at the moment. I share only results, because I would like to peel the potato myself. And that's the fun part. Code:
Bytes      Entropy     Chi²     exceeded %   Mean       Monte Carlo Pi (err %)   Serial corr.
16384      7.989643    234.31   81.92        126.1403   3.194139194 (1.67)        0.002701
32768      7.994758    238.28   76.65        126.8322   3.152536166 (0.35)       -0.002910
65536      7.997050    268.51   26.85        126.7138   3.155099799 (0.43)        0.001993
131072     7.998566    260.91   38.62        126.9442   3.145433738 (0.12)        0.002855
262144     7.999312    251.13   55.68        127.0865   3.145342184 (0.12)        0.001801
524288     7.999609    284.21   10.09        127.3320   3.139080578 (0.08)        0.000191
1048576    7.999843    228.00   88.70        127.3755   3.140202103 (0.04)       -0.001312
2097152    7.999920    233.88   82.45        127.4360   3.143305915 (0.05)       -0.000797
4194304    7.999957    252.43   53.36        127.4706   3.143122810 (0.05)       -0.000971
8388608    7.999976    273.55   20.28        127.4586   3.144422327 (0.09)       -0.000550
(ent output, condensed to one row per file size; "optimum compression" was 0 percent in every run; a truly random source would give a mean of 127.5 and a serial correlation of 0.0)
It always stays within at most 1 standard deviation of the mean. I didn't play with it much; I would like to take some time to think about it again before the NIST tests. The results you see in this thread are the only ones I tried. Please don't LOL now, because I don't know whether it is possible. The idea is as follows: I start with a 32-dimensional hypercube on a stream of pi. Because a 0-dimensional hypercube is a point and a 1-dimensional hypercube is a line, if I "translate" the current dimension of my hypercube, I should get back a new hypercube of 32+1 dimensions. OK, I "translate" it, considering the minimum distance between the old points and the new ones. It should be, in some way (I think), a constant distance. My seed works on the rules for the convergence of pi (our hypercube is a special one, born from pi) as a method to keep the distances constant. The sums of the old point values in the old dimension and the new ones should always stay within the digit expansion of pi. If these sums are correct, they will not break the relation. We would always have the same constant distance, but new and newer dimensions stacked on the initial vector, getting back for free all the properties of pi (the goal). It is a bit of a sci-fi algo, I know, but I like it very much. I tested it only once, on 16 MB of data, with the diehard tests, and it seems it passed them all!
Cheers,
_________________ ⠓⠕⠏⠉⠕⠙⠑
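For readers unfamiliar with the ent report quoted above: the "Monte Carlo value for Pi" line comes from packing bytes into coordinates and counting hits inside a circle. A rough sketch of the idea, my own simplification rather than ent's actual source (the 6-bytes-per-point chunking and the radius are assumptions):

```python
import random

def monte_carlo_pi(data):
    # Each point consumes 6 bytes: a 24-bit x and a 24-bit y in [0, 2**24).
    # The fraction of points inside the circle of radius 2**24 - 1, times 4,
    # estimates pi; the exact chunking/radius in ent itself may differ.
    r2 = (2 ** 24 - 1) ** 2
    inside = total = 0
    for i in range(0, len(data) - 5, 6):
        x = (data[i] << 16) | (data[i + 1] << 8) | data[i + 2]
        y = (data[i + 3] << 16) | (data[i + 4] << 8) | data[i + 5]
        if x * x + y * y <= r2:
            inside += 1
        total += 1
    return 4.0 * inside / total

random.seed(42)
stream = bytes(random.randrange(256) for _ in range(600_000))
print(round(monte_carlo_pi(stream), 2))  # close to 3.14 for a good source
```

This also shows why the test is weak on its own: a stream of all-zero bytes puts every point at the origin and "estimates" pi as exactly 4.0, yet a constant stream fails every other metric in the report.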
revolution 02 Nov 2013, 00:05
Actually, the world is not hurting for a new PRNG. There are already lots of great ones out there (and a lot of crappy ones too). It is just a matter of pointing people to the right places, to help them see that the first PRNG they encounter is not the only type available.
The other point that needs mentioning is that passing all the various test suites (diehard et al.) is no guarantee of anything. Lots of PRNGs can pass all the tests. What is more important is the implementation: if it is implemented badly, then you lose all the advantages and won't even know it.
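revolution's warning can be illustrated with a toy case (my own example, not from the thread): a plain LCG looks fine to a naive byte-mean check, yet anyone who knows the 32-bit state reproduces the entire stream.

```python
def lcg_bytes(seed, n):
    # Numerical Recipes LCG constants; emit the top byte of each 32-bit state.
    state = seed & 0xFFFFFFFF
    out = []
    for _ in range(n):
        state = (1664525 * state + 1013904223) & 0xFFFFFFFF
        out.append(state >> 24)
    return out

data = lcg_bytes(1, 100_000)
mean = sum(data) / len(data)
# The mean lands near 127.5, so a crude frequency test is satisfied...
clone = lcg_bytes(1, 100_000)
# ...but the generator is fully determined by any single state:
# clone reproduces data exactly — zero unpredictability despite the statistics.
```

Statistical tests measure distribution, not secrecy or structural quality, which is why passing them "is no guarantee of anything".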
hopcode 02 Nov 2013, 16:46
revolution wrote: ...There are already lots of great ones out there... I have started the thread Randomness reloaded on the x64lab message board; if anyone is interested, you may visit it. Some assembly is required. I will attach links, useful papers, some templates of my own to start with, and general results there, from time to time... at ease... no hurry.
Cheers,
_________________ ⠓⠕⠏⠉⠕⠙⠑
hopcode 11 Nov 2013, 05:35
...Morning...
I am applying my analysis R script to RND sources, and to Fibonacci (as an example), against my "pi-cubed" idea, for a first visual plotted comparison. Links and tools are there: http://board.x64lab.net/viewtopic.php?p=96#p96
_________________ ⠓⠕⠏⠉⠕⠙⠑
hopcode 12 Nov 2013, 18:24
Added a 32-bit Fibonacci LFSR assembly implementation and a test.
Compile it with fasm: http://board.x64lab.net/viewtopic.php?f=2&t=80
OK, I won't post updates about "Randomness reloaded" here anymore; you know the link. Happy tweaking
_________________ ⠓⠕⠏⠉⠕⠙⠑
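hopcode's fasm source is at the link above; for readers who want the idea without assembling anything, here is a rough Python sketch of a 32-bit Fibonacci LFSR. The tap set (32, 22, 2, 1) is my assumption, one well-known maximal-length polynomial; the taps in the forum attachment may differ:

```python
def fib_lfsr32(seed, n):
    # Fibonacci-style LFSR: the feedback bit is the XOR of the tap bits.
    # Taps (32, 22, 2, 1), i.e. polynomial x^32 + x^22 + x^2 + x + 1 —
    # an assumed maximal-length choice, not necessarily hopcode's.
    state = seed & 0xFFFFFFFF
    bits = []
    for _ in range(n):
        fb = ((state >> 31) ^ (state >> 21) ^ (state >> 1) ^ state) & 1
        bits.append(state >> 31)  # output the bit about to be shifted out
        state = ((state << 1) | fb) & 0xFFFFFFFF
    return bits, state

bits, state = fib_lfsr32(0xACE1, 10_000)
# The update is linear and invertible, so a nonzero seed
# can never fall into the all-zero lockup state.
```

With maximal-length taps, any nonzero seed walks the full 2^32 - 1 cycle, and the output stream is roughly balanced between zeros and ones.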
Copyright © 1999-2025, Tomasz Grysztar.