What is the shortest perceivable application response delay?

The 100 ms threshold was established over 30 years ago. See:

Card, S. K., Robertson, G. G., and Mackinlay, J. D. (1991). The information visualizer: An information workspace. Proc. ACM CHI'91 Conf. (New Orleans, LA, 28 April-2 May), 181-188.

Miller, R. B. (1968). Response time in man-computer conversational transactions. Proc. AFIPS Fall Joint Computer Conference Vol. 33, 267-277.

Myers, B. A. (1985). The importance of percent-done progress indicators for computer-human interfaces. Proc. ACM CHI'85 Conf. (San Francisco, CA, 14-18 April), 11-17.


What I remember learning is that any latency of more than a tenth of a second (100 ms) between typing a letter and seeing it appear begins to hurt productivity (for example, you instinctively slow down, less sure that you typed correctly), but that below that level of latency productivity is essentially flat.

Given that description, it's possible that a latency of less than 100 ms is still perceivable as not quite instantaneous (trained baseball umpires, for example, can probably resolve the order of two events even closer together than that), but it is fast enough to count as an immediate response as far as productivity is concerned. A latency of 100 ms or more is definitely perceivable, even if it is still reasonably fast.
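As an illustration, the 100 ms budget can be checked directly in a browser by timing the gap between a keystroke and the next paint. This is only a minimal sketch, assuming a browser environment and a hypothetical <input id="editor"> element on the page; it uses the standard performance.now() and requestAnimationFrame APIs, and treats the next repaint after keydown as an approximation of when the typed character appears.

    // Minimal sketch: measure keystroke-to-paint latency against the ~100 ms budget.
    const BUDGET_MS = 100;

    const editor = document.getElementById("editor") as HTMLInputElement;

    editor.addEventListener("keydown", () => {
      const pressedAt = performance.now();

      // requestAnimationFrame fires just before the next repaint, so the delta
      // approximates how long the user waited to see the typed character appear.
      requestAnimationFrame(() => {
        const latency = performance.now() - pressedAt;
        if (latency > BUDGET_MS) {
          console.warn(`Keystroke echo took ${latency.toFixed(1)} ms (budget ${BUDGET_MS} ms)`);
        }
      });
    });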

That covers visual feedback that a specific input has been received. Beyond that, there is a separate standard of responsiveness for the requested operation itself. If you click a form button, getting visual feedback of the click (e.g. the button takes on a "depressed" look) within 100 ms is still the ideal, but after that you expect something else to happen. If nothing happens within a second or two, as others have said, you start to wonder whether the click registered or was ignored; hence the convention of displaying some sort of "working..." indicator whenever an operation might take more than about a second before showing a clear effect (e.g. while waiting for a new window to open).
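To make the two budgets concrete, here is a hypothetical sketch of that pattern: the click is acknowledged immediately, and the "working..." indicator is revealed only if the operation has not finished within about one second. The element IDs, the doWork() placeholder, and the exact 1000 ms delay are all assumptions made for the example, not a prescribed implementation.

    // Hypothetical sketch: acknowledge the click at once, show a "working..."
    // indicator only if the operation outlasts the ~1 second patience window.
    const SPINNER_DELAY_MS = 1000;

    const button = document.getElementById("submit") as HTMLButtonElement;
    const spinner = document.getElementById("spinner") as HTMLElement;

    async function doWork(): Promise<void> {
      // Placeholder for the real operation (network request, computation, ...).
      await new Promise((resolve) => setTimeout(resolve, 2500));
    }

    button.addEventListener("click", async () => {
      button.disabled = true; // immediate visual feedback that the click landed

      // Only reveal the indicator if the work is still running after ~1 s,
      // so fast operations never flash a spinner.
      const spinnerTimer = setTimeout(() => {
        spinner.hidden = false;
      }, SPINNER_DELAY_MS);

      try {
        await doWork();
      } finally {
        clearTimeout(spinnerTimer);
        spinner.hidden = true;
        button.disabled = false;
      }
    });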


New research as of January 2014:

http://newsoffice.mit.edu/2014/in-the-blink-of-an-eye-0116

...a team of neuroscientists from MIT has found that the human brain can process entire images that the eye sees for as little as 13 milliseconds... That speed is far faster than the 100 milliseconds suggested by previous studies...


At the San Francisco Opera house, we routinely set up precise delay settings for each of our speakers. We can detect 5 millisecond changes in the delay times to our speakers. When you make such subtle changes, you change where the sound appears to source from. Often we want the sound to seem as if it's coming from someplace other than where the speakers are, and precise delay adjustments make that possible. Sound delays of 15 milliseconds are very obvious even to untrained ears, because they radically shift where the sound appears to come from. A simple test to prove this is to play sound through multiple speakers and have the subject close their eyes and point to where the sound is coming from. Then make a slight change of just a few milliseconds to the delay time of one of the speakers, and have the person point again. Changing delay times is acoustically very similar to moving the actual speakers.
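For a sense of scale, a playback delay maps onto a shift in apparent source distance via the speed of sound, roughly 343 m/s in air. The function below is purely illustrative (not anything the opera house uses), but it shows why even a 5 ms change is on the order of a meter and a half of apparent movement.

    // Illustrative only: convert a speaker delay into the equivalent shift in
    // apparent source distance, using the speed of sound in air (~343 m/s).
    const SPEED_OF_SOUND_M_PER_S = 343;

    function delayToDistanceMeters(delayMs: number): number {
      return SPEED_OF_SOUND_M_PER_S * (delayMs / 1000);
    }

    console.log(delayToDistanceMeters(5));  // ≈ 1.7 m
    console.log(delayToDistanceMeters(15)); // ≈ 5.1 m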