- From: Frank Mori Hess <fmhess_at_users.sourceforge.net>
- Date: Wed, 29 Oct 2003 09:34:35 -0600
On Tuesday 28 October 2003 06:35 pm, Kenneth Jacker wrote:
> We're using comedi_data_read_delayed() on a MCC 1602/16 board in our
> application with a recent CVS version of COMEDI.
>
> Two questions:
>
>   o Though the ADC doesn't begin converting until "nano_seconds"
>     time elapses, does the function return immediately? Or, will
>     the function only return when the conversion is completed?

When complete.

>   o What determines the minimal *usable* value of the "nano_seconds"
>     parameter? We want to delay for ~800ns on a 500MHz machine ...
>     should this work?

You can use 800 ns. It will actually delay longer due to overhead and the
fact that udelay() has an actual resolution of a microsecond. I wouldn't
set the delay to less than you actually need in an attempt to compensate,
though, since it's possible that some day in the future the overhead will
be reduced.

--
Frank
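[For reference, a minimal sketch of the call being discussed. The device path
/dev/comedi0, subdevice 0, channel 0, and range 0 are assumptions for
illustration only; adjust them for your board. It follows the usual comedilib
calling convention for comedi_data_read_delayed().]

    /* Sketch: read one sample, delaying ~800 ns before conversion starts.
     * The call blocks until the conversion completes; the real delay will
     * be longer than 800 ns because udelay() has ~1 us resolution. */
    #include <stdio.h>
    #include <comedilib.h>

    int main(void)
    {
        comedi_t *dev;
        lsampl_t data;
        int ret;

        dev = comedi_open("/dev/comedi0");   /* assumed device path */
        if (!dev) {
            comedi_perror("comedi_open");
            return 1;
        }

        /* subdevice 0, channel 0, range 0, ground reference (assumed) */
        ret = comedi_data_read_delayed(dev, 0, 0, 0, AREF_GROUND,
                                       &data, 800 /* nanoseconds */);
        if (ret < 0) {
            comedi_perror("comedi_data_read_delayed");
            comedi_close(dev);
            return 1;
        }

        printf("sample: %u\n", data);
        comedi_close(dev);
        return 0;
    }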
Received on 2003-10-29 15:34:35 UTC