While working on a BSP based on the nRF51 SoC, I noticed what may be an issue with the current division of config data between the 'bsp' and the 'mcu' code.

For the nRF51 (and likely the nRF52, I haven't looked yet) the LF clock source is defined in 'mcu/nordic/nrf51xxx/src/hal_os_tick.c':

        /* Turn on the LFCLK */
        NRF_CLOCK->XTALFREQ = CLOCK_XTALFREQ_XTALFREQ_16MHz;   /* HF crystal is 16 MHz */
        NRF_CLOCK->TASKS_LFCLKSTOP = 1;                        /* stop the LFCLK before reconfiguring */
        NRF_CLOCK->EVENTS_LFCLKSTARTED = 0;                    /* clear any pending started event */
        NRF_CLOCK->LFCLKSRC = CLOCK_LFCLKSRC_SRC_Xtal;         /* LF source hard coded to the 32kHz XTAL */
        NRF_CLOCK->TASKS_LFCLKSTART = 1;                       /* start the LFCLK */

The XTAL is hard-coded as the source, meaning a 32kHz crystal must be present on the board for the LF clock to work. This is really a board-level choice, though: we have a number of boards in production that leave the 32kHz crystal off to control costs, accepting slightly higher power consumption, and instead synthesize the LF clock from the mandatory 16MHz XTAL (a valid option on the nRF51 SoC).
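
For reference, the synthesized-LFCLK variant is the same register sequence with a different source selection. A minimal sketch, using the register and bitfield names from the nRF51 SDK headers:

        /* Turn on the LFCLK, synthesized from the 16MHz HFCLK (no 32kHz XTAL fitted) */
        NRF_CLOCK->XTALFREQ = CLOCK_XTALFREQ_XTALFREQ_16MHz;
        NRF_CLOCK->TASKS_LFCLKSTOP = 1;
        NRF_CLOCK->EVENTS_LFCLKSTARTED = 0;
        NRF_CLOCK->LFCLKSRC = CLOCK_LFCLKSRC_SRC_Synth;        /* synthesize LFCLK from HFCLK */
        NRF_CLOCK->TASKS_LFCLKSTART = 1;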

The real question, of course, is whether clock source or PLL setup decisions (frequency, multipliers, etc.) should be defined at the 'mcu' level or migrated to the 'bsp' level. Hard-coding this without an easy override seems like an unnecessary restriction and will just push people to modify the shared MCU code.

There is a small design challenge in defining a generic interface for this, of course, since clock and PLL configuration is specific to each silicon vendor and their own design decisions. But if there isn't already a mechanism to define the clock source and setup at the BSP level, it's probably worth considering.
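
One possible shape for this, purely as a sketch (BSP_LFCLK_SRC is a hypothetical name, not an existing define): the mcu code could take the LF clock source from a BSP-provided define and fall back to the XTAL when the BSP says nothing, which keeps the current behavior for existing boards:

        /* Hypothetical: let the BSP choose the LF clock source, default to the XTAL */
        #ifndef BSP_LFCLK_SRC
        #define BSP_LFCLK_SRC   CLOCK_LFCLKSRC_SRC_Xtal
        #endif

        NRF_CLOCK->LFCLKSRC = BSP_LFCLK_SRC;
        NRF_CLOCK->TASKS_LFCLKSTART = 1;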

Before going too far down that path, though, I wanted to ask in case there is already a mechanism for this that I missed.
