Time a function
You are encouraged to solve this task according to the task description, using any language you may know.
How long does it take to execute a given function with given arguments?
Use a timer with the least granularity available on your system.
What caveats are there?
This task is intended as a subtask for Measure relative performance of sorting algorithms implementations.
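Since the task asks for the timer with the least granularity available, it can help to check what resolution each clock on the system actually offers. A small Python 3 sketch (not part of the solution below) probing the standard clocks:

```python
import time

# Print the reported resolution (in seconds) of each standard clock.
# On most systems perf_counter has the finest granularity.
for name in ('time', 'monotonic', 'perf_counter', 'process_time'):
    info = time.get_clock_info(name)
    print('%-13s resolution: %g s' % (name, info.resolution))
```

One caveat this exposes: the wall-clock `time` clock may tick far more coarsely than `perf_counter`, and `process_time` excludes time spent sleeping or blocked, so the right choice depends on what you want to measure.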
Python
Given a function and its arguments, return the time (in microseconds) it takes to make the call.
Note: There is an overhead in executing a function that does nothing.
import sys
import timeit

def usec(function, arguments):
    modname, funcname = __name__, function.__name__
    timer = timeit.Timer(stmt='%(funcname)s(*args)' % vars(),
                         setup='from %(modname)s import %(funcname)s; args=%(arguments)r' % vars())
    try:
        t, N = 0, 1
        while t < 0.2:
            t = min(timer.repeat(repeat=3, number=N))
            N *= 10
        # N has been multiplied by 10 once more than the last measurement
        # used, hence the extra factor of 10 in the numerator.
        microseconds = round(10000000 * t / N, 1)  # per loop
        return microseconds
    except Exception:
        timer.print_exc(file=sys.stderr)
        raise
def nothing():
    pass

def identity(x):
    return x
Example
>>> print usec(nothing, [])
1.7
>>> print usec(identity, [1])
2.2
>>> print usec(pow, (2, 100))
3.3
>>> print map(lambda n: str(usec(qsort, (range(n),))), range(10))
['2.7', '2.8', '31.4', '38.1', '58.0', '76.2', '100.5', '130.0', '149.3', '180.0']
where qsort() is the implementation from the Quicksort page. The timings show that this qsort() implementation has quadratic time dependence for already-sorted sequences (instead of the O(N*log(N)) average case).
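For comparison, the same idea can be sketched without `timeit` using `time.perf_counter()` directly (Python 3). The helper name `usec_simple` is hypothetical, not part of the task; it uses the same best-of-several-trials strategy as the solution above to filter out scheduler noise:

```python
import time

def usec_simple(function, arguments, repeat=5, number=1000):
    # Run the call `number` times per trial, keep the fastest trial,
    # and report the per-call time in microseconds.
    best = float('inf')
    for _ in range(repeat):
        start = time.perf_counter()
        for _ in range(number):
            function(*arguments)
        best = min(best, time.perf_counter() - start)
    return best / number * 1e6  # microseconds per call

print(usec_simple(pow, (2, 100)))
```

Unlike the `timeit`-based version, this sketch pays the overhead of the inner Python `for` loop on every call, so for very fast functions its numbers will be inflated; subtracting the time of a do-nothing call (as the overhead note above suggests) partly compensates.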