flat assembler
Message board for the users of flat assembler.
hopcode 21 Dec 2011, 05:40
revolution wrote: it is easiest to generate the Uniform distribution (like all the generators here) and do post-processing to alter the output to suit the desired target distribution.

I think he means simply a good uniform density of return values: http://en.wikipedia.org/wiki/Normal_distribution
Just out of curiosity, a beginner's question: isn't it simpler to calculate the "Normal" one first?
_________________ ⠓⠕⠏⠉⠕⠙⠑
AsmGuru62 21 Dec 2011, 14:59
@hopcode: I looked at the 'normal' distribution on Wikipedia and it resembles a symmetrical curve (like a Bell Curve), so I assume that the numbers are not generated with equal probability.
hopcode 21 Dec 2011, 23:12
AsmGuru62 wrote: @hopcode: I looked at the 'normal' distribution on Wikipedia and it resembles a symmetrical curve (like a Bell Curve), so I assume that the numbers are not generated with equal probability.

Hi AsmGuru62, that's Sanskrit to me; that is the reason why I asked. What I can say now is that when I see expectation I see chi-square. It's a very simple and basic concept that all distributions (Coke's distribution included) share. For example, in the Gaussian one, the "standard" as defined by Gauss corresponds exactly to the chi-square. Perhaps revolution can explain something about the subject once he is done fixing his flat tire with the car jack, before Xmas I hope.

Perhaps tomorrow evening I will post my modification of the main algo; I am having a lot of problems with gnuplot, it is so overcomplicated. The MOD solution you adopted above has a very bad distribution, don't use it; I tested something like that two hours ago. Cheers,
_________________ ⠓⠕⠏⠉⠕⠙⠑
AsmGuru62 22 Dec 2011, 00:47
Yes, I was actually against the MOD algo.
revolution 22 Dec 2011, 02:14
Usually we generate a uniform distribution with a PRNG. This means that if we graph the counts of each output number we expect to see a flat line.
Take a simple example of a PRNG that generates numbers from 0 to 15 inclusive in a uniform distribution. If we run our PRNG 16,000,000 times then we expect to see 1,000,000 0's, 1,000,000 1's, ... up to 1,000,000 15's. (In most good quality PRNGs and RNGs we would usually see small deviations from this ideal expectation.)

Instead, let's say we want a normal distribution (a bell curve result). Then we need to modify our PRNG to generate more of some numbers and fewer of others. Like this: 2873 0's, 16286 1's, 72802 2's, 253970 3's, 690272 4's, 1461246 5's, 2409153 6's, 3093398 7's, 3093398 8's, 2409153 9's, 1461246 10's, 690272 11's, 253970 12's, 72802 13's, 16286 14's, 2873 15's. (mean = 7.5, SD = 2)

How we go about modifying our PRNG to create numbers with different distributions is left as an exercise for the reader.

Last edited by revolution on 22 Dec 2011, 02:33; edited 1 time in total
revolution 22 Dec 2011, 02:27
AsmGuru62 wrote: Yes, I was actually against the MOD algo.

Let's take our simple example above of a uniform distribution from 0 to 15 inclusive and compare the mod method with the multiply method.

mod method: output = input mod 10
multiply method: output = truncate ( input * 10 / 16 )

Code:
input  mod  multiply
    0    0         0
    1    1         0
    2    2         1
    3    3         1
    4    4         2
    5    5         3
    6    6         3
    7    7         4
    8    8         5
    9    9         5
   10    0         6
   11    1         6
   12    2         7
   13    3         8
   14    4         8
   15    5         9
hopcode 22 Dec 2011, 03:49
Thank you for the good explanation; that confirms exactly what I understood about chi-square and a few other things. Now I may consider my algo in the category of the "uniform" generators but not a PRNG, when PRNG means exactly this: http://en.wikipedia.org/wiki/Pseudorandom_number_generator or at least that seems to be its main behaviour. I have now improved my chi-square up to a delta of +20% of the standard randomness; that is almost satisfying for me, provided that uniformity remains good to the same degree for 1, 2 and 4 byte output (and presumably 8 too). Anything else to keep as a fundamental reference when interpreting diehard results? Cheers,

Btw, I call it uniform but obviously that cannot be seen in my previous snippet. The new one I will post tomorrow has some scrambling of the seed value in order to give more "stability", thus crunching the cyclical value obtained from RDTSC.
_________________ ⠓⠕⠏⠉⠕⠙⠑
revolution 22 Dec 2011, 04:14
hopcode wrote: now i may consider my algo in the category of the "uniform" generators but not PRNG

Edit: Also, it is conceivably possible to reproduce the entire sequence many times over. We could theoretically restart the computer system in the same state many times and always get the same output sequence.
hopcode 22 Dec 2011, 04:26
Anyway, I find it absurd that the difference between an RNG and a PRNG should consist mainly in the distribution, a math function, because during testing one requires same-seeds -> same-outputs, and yet in real cases one would call "RNG" a mechanism acting on, and outputting from, only a subset of the numbers it should output but doesn't by design. If that were not so, it would be a PRNG. Also, it being a matter of mere terminology, I find it totally weird and even a bit typically GNU-C (read: General-Not-Utterable-Code).
_________________ ⠓⠕⠏⠉⠕⠙⠑
revolution 22 Dec 2011, 04:35
hopcode wrote: anyway, that the difference between RNG and PRNG consists mainly in the distribution, a math function,
hopcode 22 Dec 2011, 04:37
revolution wrote: there is no random element used anywhere. The output sequence after seeding with RDTSC is wholly deterministic. The input seed with RDTSC is wholly predictable. It is theoretically possible to know the entire state of the CPU and memory system and thus it is possible to completely predict the output sequence.

But I agree with you, theoretically. In fact, when systems are cracked, it happens on a crude and deterministic subset of that theoretical "whole" by which coders wrote their passionately perfect unbreakable code.
_________________ ⠓⠕⠏⠉⠕⠙⠑
hopcode 22 Dec 2011, 04:42
revolution wrote:
Indeed, but this is mere terminology for me, nonsensical. Here is the quote from Wikipedia: Code:
A PRNG can be started from an arbitrary starting state using a seed state.
It will always produce the same sequence thereafter when initialized with that
state. The maximum length of the sequence before it begins to repeat is
determined by the size of the state, measured in bits...
_________________ ⠓⠕⠏⠉⠕⠙⠑
revolution 22 Dec 2011, 04:46
The Wikipedia quote is correct. A true RNG would never be able to be restarted from a known place, and it would never repeat.
All PRNGs must eventually repeat because the state (or input seed) is of limited size, and all PRNGs can be started (seeded) from a known place.
hopcode 22 Dec 2011, 04:59
ok, revolution, you are certainly more expert than me.
I will post my algo tomorrow and you can tell us what you think, and the type. Imho whether it is an RNG or a PRNG, or something else, is a matter of terminology, because both can fail on a finite state machine. Thank you for your patience. Cheers,
_________________ ⠓⠕⠏⠉⠕⠙⠑
revolution 22 Dec 2011, 05:05
hopcode wrote: and the type.

hopcode wrote: imho whether RNG or PRNG, or other, is a matter of terminology because both can fail on a finite state machine.
hopcode 23 Dec 2011, 00:20
revolution wrote: What do you mean by "fail on a finite state machine"?

revolution wrote: If we assume a priori that our PRNG is indistinguishable from an RNG then it shouldn't matter

revolution wrote: There is a fundamental difference between PRNG and RNG that I think you may not understand.

You don't need to presume that the people in front of you understand what you say and how. So, in your exceptional frame, since the subject looks plain and well understood to the other people in "Fast and small multiplicative...", it would be very desirable for you to read the thread, and my questions related to the comprehension of the values I posted, carefully again. And then, as a final clause again:

revolution wrote: You can't make a PRNG into an RNG by tweaking parameters or algorithms.

What you are saying, apart from some of it, sounds really confusing. Other people have opinions different from yours, not biased by the same experiences you have. And now, because of that bad presuming behaviour, I will not post my code here today. Na.. tell us, tell us if it is an RNG or a PRNG, if you can! You are not alone, man. Don't forget. Cheers,
_________________ ⠓⠕⠏⠉⠕⠙⠑
AsmGuru62 23 Dec 2011, 15:20
He was saying that if a generator is started with ANY numeric value - no matter what number it is - AND the next number is derived from that first number with NO MATTER WHAT LOGIC - then it is a PSEUDO-RNG.

Now, think of a random generator where the numbers are produced by atoms colliding with each other - THAT generator would be a real RNG, not PSEUDO, because there is no way to predict in which sequence they will collide and affect other collisions. I was working on one of these hardware RNGs at the end of the 1980s (in Russia). I was coding self-diagnostic tests for it in IBM-370 ASM - cool stuff! - and it was my first job after University.
hopcode 27 Dec 2011, 14:29
Merry Xmas and a Happy New 2012.

AsmGuru62 wrote: He was saying that if a generator is started with ANY numeric value - no matter what number it is AND the next number is derived from that first number with NO MATTER WHAT LOGIC - then it is a PSEUDO-RNG.

No, no, no. They teach you that verbal distinction. Look at this: it is exactly a hardware generation as you imagined above, 8kb from http://www.fourmilab.ch/hotbits/ plotted here: http://sites.google.com/site/x64lab/hotbits8k_read8.gif and after ent it shows nothing special:

Code:
Entropy = 7.978051 bits per byte.
Optimum compression would reduce the size of this 8192 byte file by 0 percent.
Chi square distribution for 8192 samples is 244.88, and randomly would exceed this value 66.45 percent of the times.
Arithmetic mean value of data bytes is 128.1207 (127.5 = random).
Monte Carlo value for Pi is 3.117948718 (error 0.75 percent).
Serial correlation coefficient is -0.012211 (totally uncorrelated = 0.0).

I am testing a lot of other algos, among them a couple of clever ones from some friends. Those algos reach "far better" or "analogous" performance than the hotbits random generator. In your case, drawing mountains without the FPU, as I said, avoid using lc53. As an alternative, you may download 1kb of hotbits random bytes (if you don't trust yourself). But, as in the tests I will publish as soon as I can, there is no need to use that 1kb of random bytes: a timestamp has the same entropy for me, because we cannot actually conceive exactly what "random" means. Cheers,

Btw: you can find my Park-Miller-Carta implementation, which gets rid of the final if-branching correction, at http://sites.google.com/site/x64lab/home/reloaded-algorithms/park-miller-carta-random-lcg
_________________ ⠓⠕⠏⠉⠕⠙⠑
revolution 27 Dec 2011, 14:36
hopcode: Many good PRNGs are indistinguishable from RNGs but that does not make them RNGs. It is not merely a verbal thing, it is a fundamental difference. PRNGs must be seeded and they generate repeatable results, that is what makes them different from RNGs.
Edit: Also, some RNGs are really crappy and have bad bias and poor spreads, but they are still RNGs. And no matter how bad they are at producing bits they will always be RNGs.
Copyright © 1999-2025, Tomasz Grysztar. Also on GitHub, YouTube.