Abstract: Response latency (RT) is a crucial measure in the study of human behavior, defined as the interval between the onset of a stimulus or task and the subject's response. Differences in time bases between visual stimulus generation devices and response collection devices introduce errors into RT measurements. These errors can be mitigated with serial port synchronization signals, but little information is available regarding their accuracy. This study investigates methods for reducing these time errors to achieve accurate RT measurements. PsychToolbox was used to generate visual stimuli and serial port synchronization signals so that their timing could be compared. The findings are as follows. First, the serial synchronization signal precedes the visual stimulus, with a smaller lead time observed at higher refresh rates. Second, the lead time increases as the stimulus position shifts further to the right and downward. In addition, serial port synchronization signals were more accurate under Linux and with IOPort(). Given the inherently poor timing accuracy of serial port synchronization signals and the multiple factors that influence them, it is recommended to measure RT with a light signal instead. However, differences in position between the light signal and the visual stimulus introduce errors. To address this issue, a calibration formula is proposed and verified in this study. The results indicate that, during the darkening process, the mean time error is about -0.1 ms. This accuracy enables precise calculation of the visual stimulus presentation moment, thereby yielding accurate RT. This study offers valuable insights for optimizing experimental design and improving the accuracy of RT measurements.