Getting a (very) high resolution timer in C#
Hello folks! I'm currently working on some numeric-integration-related stuff (specifically, creating a fixed-timestep system for a game), and in order to correctly interpolate subframes between steps (and we're talking about 120-240 steps per second), I need a high-resolution timer - preferably with sub-millisecond precision!
Does anyone have recommendations for facilities in .NET that would allow me to get such a timer?
(Please ping me on reply, thank you in advance!)
https://learn.microsoft.com/en-us/windows/win32/procthread/multimedia-class-scheduler-service is the most accurate you can get, afaik
Multimedia Class Scheduler Service - Win32 apps
The Multimedia Class Scheduler service (MMCSS) enables multimedia applications to ensure that their time-sensitive processing receives prioritized access to CPU resources.
no idea if there are existing C# bindings
Are you sure? This looks more like CPU scheduling utilities rather than an actual timer itself
well my answer was going to be multimedia timers until i saw that those are deprecated and they recommend using this instead
I do know it's technically possible to use Win32 QueryPerformanceCounter, but I would generally prefer to use something native to C# if possible, since otherwise I'll have to rewrite adaptations for every platform I want to support
there is no cross-platform timing API with the resolution you're asking for afaik
Really? No option for microsecond-level timings?
Or at least fractional milliseconds.
even if there were, it's likely that every single system your application is running on is not realtime
and you wouldn't be guaranteed that precision anyway
Well, my primary platform is Desktop computers for now
Be it Linux or Windows
you only have so much influence over when the OS schedules your threads to run
Maybe I could get away with a lower resolution actually? Since the main thing I need this for is interpolating between states across game ticks
Though even there we're looking at millisecond intervals.
i'm kind of curious what game needs interpolation when it's already supposedly running at 240 FPS
Well, it's just an example - I'd ideally like to be able to run at any tickrate without issues. A more likely target is 90 or 120 ticks per second
But we need to run the interpolation for every single drawn frame
Since that's how we can get deltas for actually interpolating stuff
Principally it's all based on this quite well-known article: https://gafferongames.com/post/fix_your_timestep/
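For anyone following along, here's a minimal sketch of the accumulator pattern from that article, using `Stopwatch` as the clock. The "state" (one coordinate moving at a fixed speed) and the bounded frame loop are just stand-ins for real game code:

```csharp
using System;
using System.Diagnostics;

class FixedTimestepDemo
{
    const double Dt = 1.0 / 120.0;                   // fixed simulation step (120 ticks/s)

    static void Main()
    {
        double position = 0.0, prevPosition = 0.0;   // toy one-dimensional state
        var clock = Stopwatch.StartNew();
        double previousTime = clock.Elapsed.TotalSeconds;
        double accumulator = 0.0;

        for (int frame = 0; frame < 1000; frame++)   // stand-in for the render loop
        {
            double now = clock.Elapsed.TotalSeconds;
            accumulator += now - previousTime;       // bank the real time that passed
            previousTime = now;

            while (accumulator >= Dt)                // run as many fixed steps as fit
            {
                prevPosition = position;             // remember state before the step
                position += 10.0 * Dt;               // "integrate": move 10 units/s
                accumulator -= Dt;
            }

            // Leftover time tells us how far we are into the next tick;
            // blend the last two simulated states for drawing.
            double alpha = accumulator / Dt;         // always in [0, 1)
            double drawPosition = position * alpha + prevPosition * (1.0 - alpha);
            // Render(drawPosition); ...
        }

        Console.WriteLine("done");
    }
}
```

The key point is that rendering never sees a raw simulation state - it always sees a blend of the last two ticks, weighted by how much un-simulated time is sitting in the accumulator.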
to clarify, you're looking for a way to trigger code to run in a precise high resolution way right?
because if you just want to measure time, use a `Stopwatch`
No, I just want to get the time since the application started running, as precisely as possible
What's the precision on it?
platform dependent
the `Frequency` property will tell you how many ticks per second it can measure on the system it's running on
i want to say on modern windows you can expect 100ns resolution
That seems good enough for me I'd think?
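You can check what you actually get on a given machine - something like this (the microsecond conversion is just one way to get sub-millisecond readings out of the tick count):

```csharp
using System;
using System.Diagnostics;

class StopwatchResolution
{
    static void Main()
    {
        // Frequency is ticks per second; its inverse is the smallest
        // interval the Stopwatch can distinguish on this machine.
        long ticksPerSecond = Stopwatch.Frequency;
        double nanosecondsPerTick = 1_000_000_000.0 / ticksPerSecond;

        Console.WriteLine($"IsHighResolution: {Stopwatch.IsHighResolution}");
        Console.WriteLine($"Frequency: {ticksPerSecond} ticks/s");
        Console.WriteLine($"Resolution: ~{nanosecondsPerTick:F1} ns/tick");

        // Sub-millisecond elapsed time, expressed in microseconds:
        var sw = Stopwatch.StartNew();
        double elapsedUs = sw.ElapsedTicks * 1_000_000.0 / ticksPerSecond;
        Console.WriteLine($"Elapsed: {elapsedUs:F3} us");
    }
}
```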
for a game it's far more than enough
Okay then I'll just use stopwatch, thank you!
on windows it uses a performance counter internally, so that's simpler than using the win32 apis yourself
nothing is really "running," it's just getting timestamps from that
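Right - you don't even need a `Stopwatch` instance for that; `Stopwatch.GetTimestamp()` hands you the raw counter value directly:

```csharp
using System;
using System.Diagnostics;

class TimestampDemo
{
    static void Main()
    {
        // GetTimestamp reads the underlying counter directly - nothing
        // needs to be "running" for this to work.
        long start = Stopwatch.GetTimestamp();
        // ... work being timed ...
        long end = Stopwatch.GetTimestamp();

        // Convert raw ticks to seconds via the counter frequency.
        double seconds = (end - start) / (double)Stopwatch.Frequency;
        Console.WriteLine($"Elapsed: {seconds * 1000.0:F4} ms");
    }
}
```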
Alright, makes sense, thank you!
at the scales you're looking to measure, the act of measuring itself affects the timing