How do I get millisecond and microsecond-resolution timestamps in Python?

UPDATE: I finally figured this out and would like to share the knowledge and save someone a bunch of time, so see my answer below. I've finally figured it out for Linux too, including for pre-Python 3.3 (ex: for the Raspberry Pi), and I've posted my new module/code in my answer below.

Original question:

How do I get millisecond and microsecond-resolution timestamps in Python?

I'd also like the Arduino-like delay() (which delays in milliseconds) and delayMicroseconds() functions.

Note to the community:

Please don't mark this question as a duplicate and say it has an answer elsewhere when it definitely does not.

This question was incorrectly closed and marked as a duplicate of this one in approx. Sept. 2018. It was then finally re-opened two years later, on 23 Aug. 2020. Thank you for re-opening it! It's not a duplicate; that was a mistake. See more information on why just below.

It says, "This question already has an answer here." Unfortunately, that's just not true. I read those answers before asking this question, years ago, and they don't answer my question nor meet my needs. They are just as inapplicable to my question as is the most downvoted answer here, which is greyed out because it is unfortunately wrong since it relies on the time module, which prior to Python 3.3 did NOT have any type of guaranteed resolution whatsoever:

Please re-open my question. It is not a duplicate, and it does not have a prior answer from another question. The question linked as already containing an answer relies on the time module, and even states that its resolution is all over the place. The most upvoted answer there reports a Windows resolution of 16 ms using their approach, which is 32000 times worse than the 0.5 us resolution of the answer I provided here. Again, I needed 1 ms and 1 us (or similar) resolutions, NOT 16000 us resolution. Therefore, it is not a duplicate.

Related:

My own answer on how to do the same thing (get ms- and us-resolution timestamps) in C++: Getting an accurate execution time in C++ (micro seconds)

Solution:

Here's a fully-functional module for both Windows and Linux (it works with pre-Python 3.3 too), with functions and code samples.

Functions include:

micros()

millis()

delay()

delayMicroseconds()

Python code module:

"""

GS_timing.py

-create some low-level Arduino-like millis() (milliseconds) and micros()

(microseconds) timing functions for Python

By Gabriel Staples

http://www.ElectricRCAircraftGuy.com

-click "Contact me" at the top of my website to find my email address

Started: 11 July 2016

Updated: 13 Aug 2016

History (newest on top):

20160813 - v0.2.0 created - added Linux compatibility, using ctypes, so that it's compatible with pre-Python 3.3 (for Python 3.3 or later just use the built-in time functions for Linux, shown here: https://docs.python.org/3/library/time.html)

-ex: time.clock_gettime(time.CLOCK_MONOTONIC_RAW)

20160711 - v0.1.0 created - functions work for Windows *only* (via the QPC timer)

References:

WINDOWS:

-personal (C++ code): GS_PCArduino.h

1) Acquiring high-resolution time stamps (Windows)

-https://msdn.microsoft.com/en-us/library/windows/desktop/dn553408(v=vs.85).aspx

2) QueryPerformanceCounter function (Windows)

-https://msdn.microsoft.com/en-us/library/windows/desktop/ms644904(v=vs.85).aspx

3) QueryPerformanceFrequency function (Windows)

-https://msdn.microsoft.com/en-us/library/windows/desktop/ms644905(v=vs.85).aspx

4) LARGE_INTEGER union (Windows)

-https://msdn.microsoft.com/en-us/library/windows/desktop/aa383713(v=vs.85).aspx

-*****https://stackoverflow.com/questions/4430227/python-on-win32-how-to-get-

absolute-timing-cpu-cycle-count

LINUX:

-https://stackoverflow.com/questions/1205722/how-do-i-get-monotonic-time-durations-in-python

"""

import ctypes, os

#Constants:

VERSION = '0.2.0'

#-------------------------------------------------------------------

#FUNCTIONS:

#-------------------------------------------------------------------

#OS-specific low-level timing functions:

if (os.name=='nt'): #for Windows:

def micros():

"return a timestamp in microseconds (us)"

tics = ctypes.c_int64()

freq = ctypes.c_int64()

#get ticks on the internal ~2MHz QPC clock

ctypes.windll.Kernel32.QueryPerformanceCounter(ctypes.byref(tics))

#get the actual freq. of the internal ~2MHz QPC clock

ctypes.windll.Kernel32.QueryPerformanceFrequency(ctypes.byref(freq))

t_us = tics.value*1e6/freq.value

return t_us

def millis():

"return a timestamp in milliseconds (ms)"

tics = ctypes.c_int64()

freq = ctypes.c_int64()

#get ticks on the internal ~2MHz QPC clock

ctypes.windll.Kernel32.QueryPerformanceCounter(ctypes.byref(tics))

#get the actual freq. of the internal ~2MHz QPC clock

ctypes.windll.Kernel32.QueryPerformanceFrequency(ctypes.byref(freq))

t_ms = tics.value*1e3/freq.value

return t_ms

elif (os.name=='posix'): #for Linux:

#Constants:

CLOCK_MONOTONIC_RAW = 4 # see here: https://github.com/torvalds/linux/blob/master/include/uapi/linux/time.h

#prepare ctype timespec structure of {long, long}

class timespec(ctypes.Structure):

_fields_ =\

[

('tv_sec', ctypes.c_long),

('tv_nsec', ctypes.c_long)

]

#Configure Python access to the clock_gettime C library, via ctypes:

#Documentation:

#-ctypes.CDLL: https://docs.python.org/3.2/library/ctypes.html

#-librt.so.1 with clock_gettime: https://docs.oracle.com/cd/E36784_01/html/E36873/librt-3lib.html #-

#-Linux clock_gettime(): http://linux.die.net/man/3/clock_gettime

librt = ctypes.CDLL('librt.so.1', use_errno=True)

clock_gettime = librt.clock_gettime

#specify input arguments and types to the C clock_gettime() function

# (int clock_ID, timespec* t)

clock_gettime.argtypes = [ctypes.c_int, ctypes.POINTER(timespec)]

def monotonic_time():

"return a timestamp in seconds (sec)"

t = timespec()

#(Note that clock_gettime() returns 0 for success, or -1 for failure, in

# which case errno is set appropriately)

#-see here: http://linux.die.net/man/3/clock_gettime

if clock_gettime(CLOCK_MONOTONIC_RAW , ctypes.pointer(t)) != 0:

#if clock_gettime() returns an error

errno_ = ctypes.get_errno()

raise OSError(errno_, os.strerror(errno_))

return t.tv_sec + t.tv_nsec*1e-9 #sec

def micros():

"return a timestamp in microseconds (us)"

return monotonic_time()*1e6 #us

def millis():

"return a timestamp in milliseconds (ms)"

return monotonic_time()*1e3 #ms

#Other timing functions:

def delay(delay_ms):

"delay for delay_ms milliseconds (ms)"

t_start = millis()

while (millis() - t_start < delay_ms):

pass #do nothing

return

def delayMicroseconds(delay_us):

"delay for delay_us microseconds (us)"

t_start = micros()

while (micros() - t_start < delay_us):

pass #do nothing

return

#-------------------------------------------------------------------

#EXAMPLES:

#-------------------------------------------------------------------

#Only executute this block of code if running this module directly,

#*not* if importing it

#-see here: http://effbot.org/pyfaq/tutor-what-is-if-name-main-for.htm

if __name__ == "__main__": #if running this module as a stand-alone program

#print loop execution time 100 times, using micros()

tStart = micros() #us

for x in range(0, 100):

tNow = micros() #us

dt = tNow - tStart #us; delta time

tStart = tNow #us; update

print("dt(us) = " + str(dt))

#print loop execution time 100 times, using millis()

print("\n")

tStart = millis() #ms

for x in range(0, 100):

tNow = millis() #ms

dt = tNow - tStart #ms; delta time

tStart = tNow #ms; update

print("dt(ms) = " + str(dt))

#print a counter once per second, for 5 seconds, using delay

print("\nstart")

for i in range(1,6):

delay(1000)

print(i)

#print a counter once per second, for 5 seconds, using delayMicroseconds

print("\nstart")

for i in range(1,6):

delayMicroseconds(1000000)

print(i)
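A minimal usage sketch, assuming the module above is saved as GS_timing.py somewhere on your Python path:

import GS_timing

print('millis() = ' + str(GS_timing.millis()))  # timestamp in ms
print('micros() = ' + str(GS_timing.micros()))  # timestamp in us

GS_timing.delay(100)              # block for ~100 ms
GS_timing.delayMicroseconds(500)  # block for ~500 us

Note that delay() and delayMicroseconds() busy-wait (spin in a loop polling the clock) rather than sleep, which keeps their timing tight at the cost of fully occupying a CPU core while they run.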

Originally this module worked for Windows only, and I asked for help getting the same millisecond- and microsecond-resolution timestamps in Linux. It now works for Linux too, including in pre-Python 3.3, since I'm using C functions via the ctypes module in order to read the timestamps.

Special thanks to @ArminRonacher for his brilliant pre-Python 3.3 Linux answer here: https://stackoverflow.com/a/1205762/4561887

Update: prior to Python 3.3, the built-in Python time library (https://docs.python.org/3.5/library/time.html) didn't have any explicitly high-resolution functions. Now, however, it does provide other options, including some high-resolution functions.
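For reference, here is a minimal sketch of those built-ins (this assumes Python 3.3 or later; the time.clock_gettime() call is the Linux-only route mentioned in the module's docstring above):

import time

t_sec = time.monotonic()     # monotonic clock; good for measuring intervals
t_sec = time.perf_counter()  # highest-resolution clock available

# Linux only (Python 3.3+): read CLOCK_MONOTONIC_RAW directly, which is the
# same clock my ctypes-based module uses
t_sec = time.clock_gettime(time.CLOCK_MONOTONIC_RAW)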

My module above, however, provides high-resolution timestamps for Python code before Python 3.3, as well as after, and it does so on both Linux and Windows.

Here's an example of what I mean, showing that the time.sleep() function is NOT necessarily a high-resolution function. On my Windows machine, its resolution is perhaps 8 ms at best, whereas my module above has 0.5 us resolution (16000 times better!) on the same machine.

Code demonstration:

import time
import GS_timing as timing

def delayMicroseconds(n):
    time.sleep(n / 1000000.)

def delayMillisecond(n):
    time.sleep(n / 1000.)

t_start = 0
t_end = 0

# using time.sleep
print('using time.sleep')
print('delayMicroseconds(1)')
for x in range(10):
    t_start = timing.micros() # us
    delayMicroseconds(1)
    t_end = timing.micros() # us
    print('dt (us) = ' + str(t_end - t_start))
print('delayMicroseconds(2000)')
for x in range(10):
    t_start = timing.micros() # us
    delayMicroseconds(2000)
    t_end = timing.micros() # us
    print('dt (us) = ' + str(t_end - t_start))

# using GS_timing
print('\nusing GS_timing')
print('timing.delayMicroseconds(1)')
for x in range(10):
    t_start = timing.micros() # us
    timing.delayMicroseconds(1)
    t_end = timing.micros() # us
    print('dt (us) = ' + str(t_end - t_start))
print('timing.delayMicroseconds(2000)')
for x in range(10):
    t_start = timing.micros() # us
    timing.delayMicroseconds(2000)
    t_end = timing.micros() # us
    print('dt (us) = ' + str(t_end - t_start))

SAMPLE RESULTS ON MY WINDOWS 8.1 MACHINE (notice how much worse time.sleep does):

using time.sleep
delayMicroseconds(1)
dt (us) = 2872.059814453125
dt (us) = 886.3939208984375
dt (us) = 770.4649658203125
dt (us) = 1138.7698974609375
dt (us) = 1426.027099609375
dt (us) = 734.557861328125
dt (us) = 10617.233642578125
dt (us) = 9594.90576171875
dt (us) = 9155.299560546875
dt (us) = 9520.526611328125
delayMicroseconds(2000)
dt (us) = 8799.3056640625
dt (us) = 9609.2685546875
dt (us) = 9679.5439453125
dt (us) = 9248.145263671875
dt (us) = 9389.721923828125
dt (us) = 9637.994262695312
dt (us) = 9616.450073242188
dt (us) = 9592.853881835938
dt (us) = 9465.639892578125
dt (us) = 7650.276611328125

using GS_timing
timing.delayMicroseconds(1)
dt (us) = 53.3477783203125
dt (us) = 36.93310546875
dt (us) = 36.9329833984375
dt (us) = 34.8812255859375
dt (us) = 35.3941650390625
dt (us) = 40.010986328125
dt (us) = 38.4720458984375
dt (us) = 56.425537109375
dt (us) = 35.9072265625
dt (us) = 36.420166015625
timing.delayMicroseconds(2000)
dt (us) = 2039.526611328125
dt (us) = 2046.195068359375
dt (us) = 2033.8841552734375
dt (us) = 2037.4747314453125
dt (us) = 2032.34521484375
dt (us) = 2086.2059326171875
dt (us) = 2035.4229736328125
dt (us) = 2051.32470703125
dt (us) = 2040.03955078125
dt (us) = 2027.215576171875

SAMPLE RESULTS ON MY RASPBERRY PI VERSION 1 B+ (notice that the results between using time.sleep and my module are basically identical; apparently the low-level functions in time are already accessing better-resolution timers here, since it's a Linux machine running Raspbian. BUT in my GS_timing module I am explicitly calling the CLOCK_MONOTONIC_RAW timer; who knows what's being used otherwise):

using time.sleep
delayMicroseconds(1)
dt (us) = 1022.0
dt (us) = 417.0
dt (us) = 407.0
dt (us) = 450.0
dt (us) = 2078.0
dt (us) = 393.0
dt (us) = 1297.0
dt (us) = 878.0
dt (us) = 1135.0
dt (us) = 2896.0
delayMicroseconds(2000)
dt (us) = 2746.0
dt (us) = 2568.0
dt (us) = 2512.0
dt (us) = 2423.0
dt (us) = 2454.0
dt (us) = 2608.0
dt (us) = 2518.0
dt (us) = 2569.0
dt (us) = 2548.0
dt (us) = 2496.0

using GS_timing
timing.delayMicroseconds(1)
dt (us) = 572.0
dt (us) = 673.0
dt (us) = 1084.0
dt (us) = 561.0
dt (us) = 728.0
dt (us) = 576.0
dt (us) = 556.0
dt (us) = 584.0
dt (us) = 576.0
dt (us) = 578.0
timing.delayMicroseconds(2000)
dt (us) = 2741.0
dt (us) = 2466.0
dt (us) = 2522.0
dt (us) = 2810.0
dt (us) = 2589.0
dt (us) = 2681.0
dt (us) = 2546.0
dt (us) = 3090.0
dt (us) = 2600.0
dt (us) = 2400.0
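As a side note, on Python 3.3+ you can answer the "who knows what's being used" question directly: time.get_clock_info() reports the underlying C implementation and advertised resolution of each clock the time module uses on a given platform. A quick sketch:

import time

# Print the C-level implementation and advertised resolution of each clock
# (Python 3.3+ only)
for name in ('time', 'monotonic', 'perf_counter'):
    info = time.get_clock_info(name)
    print(name + ': ' + info.implementation
          + ', resolution (sec) = ' + str(info.resolution))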

