Misnomer in comedi_data_read_delayed()

There's some confusion in comedi_data_read_delayed() with respect to
nanoseconds vs. microseconds...

The prototype for comedi_data_read_delayed() is the following:

int comedi_data_read_delayed( comedi_t *it, unsigned int subdev,
                              unsigned int chan, unsigned int range,
                              unsigned int aref, lsampl_t *data,
                              unsigned int nano_sec);
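
For context, a typical call looks something like this (the device path,
channel, and range values below are placeholders, not anything
comedi-specific):

    #include <comedilib.h>

    comedi_t *dev = comedi_open("/dev/comedi0");  /* placeholder device */
    lsampl_t sample;

    /* Note the last argument: per the header it is in nanoseconds,
     * but read on... */
    int ret = comedi_data_read_delayed(dev, 0 /* subdev */, 0 /* chan */,
                                       0 /* range */, AREF_GROUND,
                                       &sample, 10000 /* nano_sec? */);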


You'll notice that the last parameter, nano_sec, seems to imply that the
passed-in value should be the time to delay, in nanoseconds.

However, this value is really in microseconds, since internally comedi
calls Linux's udelay() function, which expects a delay time in
microseconds, not nanoseconds.
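
If that's right, the actual delay is a factor of 1000 off from what the
name suggests.  Schematically (this is just a sketch of the two
interpretations, not the actual comedi source):

    /* What the name nano_sec implies should happen internally: */
    udelay(nano_sec / 1000);   /* convert nanoseconds to microseconds */

    /* What appears to actually happen: */
    udelay(nano_sec);          /* nano_sec handed to udelay() as-is,
                                  i.e. interpreted as microseconds */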

Is this the case?  At any rate, may I humbly suggest changing the
parameter name in the header file, and in any documentation describing
this function, to micro_sec or something other than nano_sec...?
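
In the meantime I've been wrapping the call so that the units are at
least explicit in my own code -- my_data_read_delayed_us() below is just
my own name for it, not part of comedi:

    static int my_data_read_delayed_us(comedi_t *it, unsigned int subdev,
                                       unsigned int chan, unsigned int range,
                                       unsigned int aref, lsampl_t *data,
                                       unsigned int micro_sec)
    {
            /* If my reading is correct, nano_sec reaches udelay()
             * unconverted, so a microsecond count is what the function
             * actually wants. */
            return comedi_data_read_delayed(it, subdev, chan, range,
                                            aref, data, micro_sec);
    }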

-Calin

Received on 2002-08-09T16:33:03Z