nervous about command timing

I have a sample program (attached) that uses a command to continuously
sample two input channels on an NI PCI-6052E.  A temporary counter lets
me stop after a few iterations for testing.  The command structure
passes the test in the cmd.c demo code, but I'm not sure it's doing
what I think it is, because the timing is all wrong.  The time taken to
run x scans should be x*scan_begin_arg, right?  The x scans are paced
to occur every scan_begin_arg ns, so the total run time should scale
with both.  But the execution time of my program does not seem to
depend on x at all, and it seems to track scan_begin_arg in us, not ns.
What's going on?
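
For reference, here is roughly the shape of the command I'm using.
This is a simplified sketch, not the attached program itself: the
chanlist setup and most error handling are trimmed, and the helper
name setup_cmd is just for illustration.

    #include <string.h>
    #include <comedilib.h>

    /* Configure an infinite, timer-paced acquisition.  chanlist entries
     * are expected to be CR_PACK(channel, range, aref) values. */
    static int setup_cmd(comedi_t *dev, unsigned int subdev,
                         unsigned int *chanlist, unsigned int n_chan,
                         unsigned int scan_period_ns)
    {
        comedi_cmd cmd;

        memset(&cmd, 0, sizeof(cmd));
        cmd.subdev = subdev;
        cmd.start_src = TRIG_NOW;        /* start immediately */
        cmd.start_arg = 0;
        cmd.scan_begin_src = TRIG_TIMER; /* pace scans off the board timer */
        cmd.scan_begin_arg = scan_period_ns;  /* period in ns, as I read it */
        cmd.convert_src = TRIG_TIMER;
        cmd.convert_arg = 0;             /* let the driver pick the rate */
        cmd.scan_end_src = TRIG_COUNT;
        cmd.scan_end_arg = n_chan;
        cmd.stop_src = TRIG_NONE;        /* run until cancelled */
        cmd.stop_arg = 0;
        cmd.chanlist = chanlist;
        cmd.chanlist_len = n_chan;

        comedi_command_test(dev, &cmd);        /* first pass fixes up args */
        if (comedi_command_test(dev, &cmd) != 0)
            return -1;                         /* still not clean */
        return comedi_command(dev, &cmd);
    }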


My goal is to have the most recent sample for each channel available
in a variable (no storage of old values), updated about every 50 us.
Since I'd also like to schedule AO writes on separate PCI DAQ boards
using the same 6052E timer, is there a way to just get a timer
interrupt and then handle all the analog I/O in the interrupt handler?
If that, or a totally different approach, would work better, I'm open
to ideas.
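
In case it helps to make the goal concrete, below is the kind of read
loop I have in mind for keeping only the latest value per channel.
It's a rough sketch: it assumes 16-bit samples (sampl_t, no
SDF_LSAMPL) and two channels interleaved in scan order, and the
drain_latest name and latest[] array are mine, not from the demos.

    #include <unistd.h>
    #include <comedilib.h>

    /* Read whatever the driver has delivered and overwrite latest[]
     * so only the newest reading per channel survives.  The read()
     * blocks until at least one sample is available. */
    static void drain_latest(comedi_t *dev, sampl_t latest[2])
    {
        sampl_t buf[256];
        static unsigned int chan = 0;
        int i, n;

        n = read(comedi_fileno(dev), buf, sizeof(buf));
        if (n <= 0)
            return;
        for (i = 0; i < n / (int)sizeof(sampl_t); i++) {
            latest[chan] = buf[i];
            chan = (chan + 1) % 2;
        }
    }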


Michael Thomas

Received on 2004-09-30T18:35:34Z