On Thu, Mar 15, 2012 at 4:39 PM, Guennadi Liakhovetski <[email protected]> wrote:
> Hi all,
>
> I stumbled across this code in spi-bitbang.c:
>
>	list_for_each_entry (t, &m->transfers, transfer_list) {
>		...
>		cs_change = t->cs_change;
>		...
>		if (!cs_change)
>			continue;
>		...
>		/* sometimes a short mid-message deselect of the chip
>		 * may be needed to terminate a mode or command
>		 */
>		ndelay(nsecs);
>		bitbang->chipselect(spi, BITBANG_CS_INACTIVE);
>		ndelay(nsecs);
>	}
>	...
>
>	/* normally deactivate chipselect ... unless no error and
>	 * cs_change has hinted that the next message will probably
>	 * be for this chip too.
>	 */
>	if (!(status == 0 && cs_change)) {
>		ndelay(nsecs);
>		bitbang->chipselect(spi, BITBANG_CS_INACTIVE);
>		ndelay(nsecs);
>	}
>
> So, IIUC, on the first occurrence cs_change is interpreted as "true ==
> have to disable CS," whereas the second one does the opposite. Shouldn't
> the latter one be inverted?

As documented in include/linux/spi/spi.h, the meaning of cs_change depends on whether or not the transfer is the last one in its message: mid-message it requests a brief deselect, while on the last transfer it hints that chipselect should stay active for the next message. While developing the s3c64xx driver I used the signal pattern output by the bitbang driver as my reference, and I didn't see any behavior contradicting what is documented.
-jassi
