flat assembler
Message board for the users of flat assembler.
Roman
I use timeGetTime.
My measured times jump between 25 ms and 15 ms! How do I fix this? Windows 7.
My video: http://www.youtube.com/watch?v=VRQ3m7SrS5I
revolution
Roman wrote: I use timeGetTime.
If you want a high-precision timer, there are other APIs available, such as QueryPerformanceCounter.
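For reference, a minimal fasm sketch of timing an interval with QueryPerformanceCounter; the labels freq, t0, and t1 are placeholders, not names from any post in this thread:

```
; hedged sketch: interval timing with QueryPerformanceCounter (Win32 fasm)
section '.data' data readable writeable
  freq dq ?                          ; ticks per second
  t0   dq ?                          ; start tick count
  t1   dq ?                          ; end tick count

section '.code' code readable executable
  invoke QueryPerformanceFrequency, freq
  invoke QueryPerformanceCounter, t0
  ; ... code being timed ...
  invoke QueryPerformanceCounter, t1
  ; tick delta into edx:eax (64-bit subtract on 32-bit registers)
  mov  eax, dword [t1]
  mov  edx, dword [t1+4]
  sub  eax, dword [t0]
  sbb  edx, dword [t0+4]
  ; elapsed milliseconds = delta * 1000 / freq (64-bit arithmetic)
```

QueryPerformanceFrequency only needs to be called once; the tick rate is fixed while the system runs.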
Roman
QueryPerformanceCounter gives the same result.
The problem is in Windows: Windows keeps switching to other threads and programs.
Roman
I run my fasm-compiled program at real-time priority, but it does not help.
revolution
Roman wrote: QueryPerformanceCounter gives the same result
Roman
revolution
QueryPerformanceCounter measures time in nanoseconds, so it gives more precise readings. But Windows jumps to other threads and programs, and this produces garbage milliseconds. If I were using QueryPerformanceCounter incorrectly, I would be getting wrong times altogether; instead I get the same times as timeGetTime gives. The only difference is that QueryPerformanceCounter reports milliseconds and nanoseconds.
revolution
QPC is for short intervals. Trying to use it for long intervals will give the problems you saw.
Besides, the crystal oscillators used on the motherboard are not precise enough to give high-resolution timings over long periods. That is a physical attribute that can't be fixed in software.
Frank
Did you try this already?
Quote: You can use the timeBeginPeriod and timeEndPeriod functions to increase the precision of timeGetTime.
Source: http://msdn.microsoft.com/en-us/library/windows/desktop/dd757629%28v=vs.85%29.aspx
If not, then also look up the timeGetDevCaps function; it gives you the system's supported timer resolutions (minimum and maximum) in milliseconds.
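A minimal fasm sketch of what the quoted MSDN advice looks like in practice; the start label is a placeholder, and the period value 1 assumes the device actually supports a 1 ms resolution (check with timeGetDevCaps first):

```
; hedged sketch: raise the timer resolution around a timeGetTime measurement
invoke timeBeginPeriod, 1        ; request 1 ms timer resolution
invoke timeGetTime               ; returns milliseconds in eax
mov    [start], eax
; ... work being measured ...
invoke timeGetTime
sub    eax, [start]              ; eax = elapsed milliseconds
invoke timeEndPeriod, 1          ; must match the timeBeginPeriod value
```

Each timeBeginPeriod call must be paired with a timeEndPeriod call using the same period value, as the raised resolution affects the whole system.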
Roman
Frank
Yes, I use timeBeginPeriod 1 and timeEndPeriod; I read about this on MSDN, but it does not help me.
I tried invoke timeGetDevCaps,GG,8 and got Min=1 and Max=1000000.
Frank
Strange.
I use timeGetTime for response-time measurement in psychology experiments. The datasets show fine-grained response times: I see 427 ms, 428 ms, 429 ms, and so on, not multiples of higher values (such as 10 ms or 15 ms).

As a high-level overview: I start with timeGetDevCaps to find Windows' minimum resolution, then I set it with timeBeginPeriod. Next come SetPriorityClass (to HIGH_PRIORITY_CLASS) and SetThreadPriority (to THREAD_PRIORITY_TIME_CRITICAL); this is for the program as a whole. The response times are then measured as the difference between two timeGetTime results.

Two more things are possibly relevant. First, in addition to timeGetTime, I use timeSetEvent for various timeout timers in the program (not during the actual measurement, but close by). Perhaps that is somehow needed as an enabling condition that I accidentally got right? Second, my program (and the whole computer) has literally nothing else to do during the time measurement; it simply waits for a key press or mouse movement. Your program seems much busier. Can you perhaps try these timing things without graphics-heavy loads?
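A rough fasm sketch of the setup sequence described above, under the assumption that TIMECAPS is two consecutive UINTs (wPeriodMin, wPeriodMax); the caps label is a placeholder:

```
; hedged sketch: Frank's setup before measuring with timeGetTime
section '.data' data readable writeable
  caps dd ?, ?                         ; TIMECAPS: wPeriodMin, wPeriodMax

section '.code' code readable executable
  invoke timeGetDevCaps, caps, 8       ; query supported resolutions
  invoke timeBeginPeriod, [caps]       ; set the minimum period found
  invoke GetCurrentProcess
  invoke SetPriorityClass, eax, HIGH_PRIORITY_CLASS
  invoke GetCurrentThread
  invoke SetThreadPriority, eax, THREAD_PRIORITY_TIME_CRITICAL
  ; ... measure as the difference of two timeGetTime results ...
  invoke timeEndPeriod, [caps]         ; restore the resolution on the way out
```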
Roman
Frank and revolution
Thanks! My video-capture program (Debut Video Capture Software) was saving MPEG-4 video, and that is why the times were jumping! I tested my program without Debut Video Capture Software and got a stable 15~16 milliseconds!
Copyright © 1999-2020, Tomasz Grysztar. Also on GitHub, YouTube, Twitter.