On 7 November 2015 at 00:39, Bjorn Andersson
<bjorn.anders...@sonymobile.com> wrote:
> On Fri 06 Nov 00:10 PST 2015, Ulf Hansson wrote:
>
>> On 6 November 2015 at 02:42, Bjorn Andersson <bj...@kryo.se> wrote:
>> > On Mon, Jul 6, 2015 at 4:53 AM, Ivan T. Ivanov <ivan.iva...@linaro.org> 
>> > wrote:
>> >> Ensure SDCC is working at the maximum clock, otherwise card
>> >> detection could be extremely slow, up to 7 seconds.
>> >>
>> >> Signed-off-by: Ivan T. Ivanov <ivan.iva...@linaro.org>
>> >> Reviewed-by: Georgi Djakov <georgi.dja...@linaro.org>
>> >> Acked-by: Stephen Boyd <sb...@codeaurora.org>
>> >> ---
>> >>
>> >> Changes since v0:
>> >> - s/falied/failed in warning message.
>> >>
>> >>  drivers/mmc/host/sdhci-msm.c | 5 +++++
>> >>  1 file changed, 5 insertions(+)
>> >>
>> >> diff --git a/drivers/mmc/host/sdhci-msm.c b/drivers/mmc/host/sdhci-msm.c
>> >> index 4a09f76..4bcee03 100644
>> >> --- a/drivers/mmc/host/sdhci-msm.c
>> >> +++ b/drivers/mmc/host/sdhci-msm.c
>> >> @@ -489,6 +489,11 @@ static int sdhci_msm_probe(struct platform_device 
>> >> *pdev)
>> >>                 goto pclk_disable;
>> >>         }
>> >>
>> >> +       /* Vote for maximum clock rate for maximum performance */
>> >> +       ret = clk_set_rate(msm_host->clk, INT_MAX);
>> >> +       if (ret)
>> >> +               dev_warn(&pdev->dev, "core clock boost failed\n");
>> >> +
>> >
>> > On my 8974AC devices this changes GCC_SDCC1_APPS_CLK from
>> > 100MHz to 200MHz for my eMMC. Unfortunately, that results in the
>> > following error:
>> >
>> > [    5.103241] mmcblk0: retrying because a re-tune was needed
>> > [    5.109270] mmcblk0: error -84 transferring data, sector 5816322,
>> > nr 2, cmd response 0x900, card status 0xc00
>> >
>> > Looking at the board specification, it's stated that these cards
>> > should run in DDR50, so I've tried specifying "max-frequency" in the
>> > DT. I verified in sdhci_set_clock() that we get a divisor of 4, but
>> > the result is a repetition of:
>>
>> I don't follow. Are you saying that changing the clock frequency to
>> 200MHz caused the card to be initialized in HS200 mode instead of
>> DDR50?
>>
>
> No. We clock the sdhci block at 100MHz, but host->max_clk is 200MHz,
> so the divisor in sdhci_set_clock() becomes 1. If I read this
> correctly, we're running HS200 at 100MHz.
>
> Bumping the clock rate to 200MHz at the block doesn't affect the max_clk
> and hence we're trying to run the bus at 200MHz.
>
> I therefore tried just setting "max-frequency" to 50MHz, getting the
> divider to be 4 and the error above.
>
> So I assume it just happened to work at 100MHz, but 200MHz is way off
> from the 50MHz the board is designed and tested for.
>
>
> Unfortunately I don't have the equipment to measure these assumptions :/

Ahh, I see.
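
So with host->max_clk reported as 200MHz while the block is actually
fed 100MHz, the divider is computed against the wrong base rate. A
simplified sketch of the divider selection (my approximation, not the
exact sdhci.c logic):

	/*
	 * The SDHCI core halves host->max_clk until the result is at
	 * or below the requested bus clock.
	 */
	static unsigned int sdhci_pick_div(unsigned int max_clk,
					   unsigned int target)
	{
		unsigned int div;

		for (div = 1; div < 256; div *= 2) {
			if (max_clk / div <= target)
				break;
		}
		return div;
	}

With max_clk = 200MHz, a requested 200MHz gives div = 1, so with the
block really fed 100MHz the bus ends up at 100MHz, just as you
describe.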

It seems like a reasonable assumption that the controller can't cope
with an input clock faster than 100MHz. That would mean there are
different versions of the controller, since some apparently work fine
at 200MHz while others are limited to 100MHz.

According to the DT compatible strings, *one* version is currently
supported, "qcom,sdhci-msm-v4"...

I see two viable solutions. One would be to limit the clock rate
depending on the version of the controller (new compatible strings
would need to be added). The other would be to limit the clock rate
via the existing DT binding for max-frequency, and thus do a
clk_set_rate(mmc->f_max) during probe.

Both approaches are already used by other host drivers: the mmci
driver does it the first way, while sdhci_bcm_kona does it the second
way.
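
For the second option, a rough and untested sketch (assuming
mmc_of_parse() has populated mmc->f_max from "max-frequency" before
this point in probe) could be:

	/*
	 * Clamp the core clock to the DT-provided maximum, if any,
	 * instead of unconditionally voting for INT_MAX.
	 */
	ret = clk_set_rate(msm_host->clk,
			   host->mmc->f_max ? host->mmc->f_max : INT_MAX);
	if (ret)
		dev_warn(&pdev->dev, "core clock boost failed\n");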

What do you think?

Kind regards
Uffe