- From: Ong Wee Liat <g0202550_at_nus.edu.sg>
- Date: Fri, 22 Oct 2004 20:43:27 +0800
Hi,

I encountered a timing problem while sampling the A/D ports of the above card. My program uses comedi_command(), and the comedi_cmd structure is set up as follows:

    freq = 1000.0;
    num_scan = 100;
    num_chan = 1;

    cmd.subdev = subd;
    cmd.flags = 0;
    cmd.start_src = TRIG_NOW;
    cmd.start_arg = 0;
    cmd.scan_begin_src = TRIG_TIMER;
    cmd.scan_begin_arg = 1e9 / freq;  /* in ns */
    cmd.convert_src = TRIG_TIMER;
    cmd.convert_arg = 5000;           /* in ns */
    cmd.scan_end_src = TRIG_COUNT;
    cmd.scan_end_arg = num_chan;      /* number of channels */
    cmd.stop_src = TRIG_COUNT;
    cmd.stop_arg = num_scan;
    cmd.chanlist = chanlist;
    cmd.chanlist_len = num_chan;

I used comedi_command_test() to verify that the above values are valid. I found that when I sampled 1 channel, the time from the start of sampling until all the samples were printed on the screen was about 2.05 s. When I changed the number of channels to 2, surprisingly, the total time dropped to 1.026 s. For 3 channels it was 0.6837 s, for 4 channels 0.51268 s, and for 8 channels 0.2568 s.

Judging from this reduction in time, my sampling frequency seems to have been increased from the desired 1 kHz to (number of channels) x 1 kHz. I checked the driver source, cb_pcidas64.c, and saw that a check is performed on the sampling frequency using convert_arg before comedi_command() is executed, but I believe my values are well within that check. For example, with 3 channels the interval between two scans is 1 000 000 ns, while the time needed to acquire 3 channels within a scan is 3 * 5000 = 15 000 ns. Hence, I do not understand why my sampling frequency is being changed by the program.

Any suggestion will be greatly appreciated.

Thanks,
wee
Received on 2004-10-22Z11:43:27