- From: Michael Thomas <Michael.Thomas_at_dynetics.com>
- Date: Thu, 30 Sep 2004 14:35:34 -0500
I have a sample program (attached) that uses a command to sample 2 input channels on an NI PCI-6052E indefinitely. A temporary counter lets me stop after a few iterations for testing. The command structure passes the test in the cmd.c demo code, but I'm not sure it's doing what I think, because the timing is all wrong.

The time taken to run x scans should be x*scan_begin_arg, right? That is, x scans, each scheduled to begin every scan_begin_arg ns. But the execution time of my program does not seem to depend on x at all, and it seems to track scan_begin_arg in us, not ns. What's going on?

My goal is to have the most recent sample for each channel in a variable (no storage of old values), updated about every 50 us. Since I'd also like to schedule AO writes on separate PCI DAQs using the same 6052E timer, is there a way to just get a timer interrupt and then handle all the analog I/O in the interrupt handler? If that or a totally different approach would work better, I'm open to ideas.

Michael Thomas
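
P.S. For concreteness, here is a minimal sketch of the kind of command setup I mean (this is not my actual cmd2.c; the device path, subdevice number, channel numbers, range, and convert timing are placeholders):

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <comedilib.h>

    #define N_CHANS 2
    #define SCAN_PERIOD_NS 50000   /* one scan of both channels every 50 us */

    int main(void)
    {
        comedi_t *dev;
        comedi_cmd cmd;
        unsigned int chanlist[N_CHANS];
        sampl_t latest[N_CHANS];
        int i, ret;

        dev = comedi_open("/dev/comedi0");   /* placeholder device file */
        if (!dev) {
            comedi_perror("comedi_open");
            return 1;
        }

        for (i = 0; i < N_CHANS; i++)
            chanlist[i] = CR_PACK(i, 0, AREF_GROUND);  /* chan i, range 0 */

        memset(&cmd, 0, sizeof(cmd));
        cmd.subdev         = 0;              /* AI subdevice (placeholder) */
        cmd.start_src      = TRIG_NOW;
        cmd.start_arg      = 0;
        cmd.scan_begin_src = TRIG_TIMER;
        cmd.scan_begin_arg = SCAN_PERIOD_NS; /* specified in nanoseconds */
        cmd.convert_src    = TRIG_TIMER;
        cmd.convert_arg    = 10000;          /* 10 us between conversions */
        cmd.scan_end_src   = TRIG_COUNT;
        cmd.scan_end_arg   = N_CHANS;
        cmd.stop_src       = TRIG_NONE;      /* run until comedi_cancel() */
        cmd.stop_arg       = 0;
        cmd.chanlist       = chanlist;
        cmd.chanlist_len   = N_CHANS;

        /* The driver may adjust arguments; test twice until it returns 0. */
        ret = comedi_command_test(dev, &cmd);
        ret = comedi_command_test(dev, &cmd);
        if (ret) {
            fprintf(stderr, "command test failed (%d)\n", ret);
            return 1;
        }

        if (comedi_command(dev, &cmd) < 0) {
            comedi_perror("comedi_command");
            return 1;
        }

        /* Keep only the newest scan by overwriting latest[] on each read.
         * A real program should also handle partial reads. */
        while (read(comedi_fileno(dev), latest, sizeof(latest))
               == (ssize_t)sizeof(latest)) {
            /* latest[0] and latest[1] now hold the most recent samples */
        }

        comedi_close(dev);
        return 0;
    }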
Attachments
- application/octet-stream attachment: cmd2.c
Received on 2004-09-30T18:35:34Z