Hi Luigi,

Actually my data has 621*1405 points and each point has 12 features. I made
it into a 2-D array and k-means works well. The last time I ran it, it used
64G of RAM on a cluster. I don't know how much more RAM I can use.
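
For reference, this is roughly the pipeline I have, plus the workaround I am
considering to avoid the full pairwise-distance matrix (the cluster count,
sample_size, and the random array below are just placeholders):

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Placeholder array with the shape described above: 621 x 1405 points, 12 features each.
X = np.random.rand(621, 1405, 12).astype(np.float32)
X2d = X.reshape(-1, 12)              # (872505, 12) 2-D array, as described

labels = KMeans(n_clusters=8, random_state=0).fit_predict(X2d)   # n_clusters is a guess

# silhouette_samples scores every point, which needs distances between all
# 872505 points; silhouette_score can instead be computed on a random subsample:
score = silhouette_score(X2d, labels, sample_size=10000, random_state=0)
print(score)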

BTW, issue 1502 is about Orange. Does the same apply to sklearn?
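
My rough understanding of why it runs out of memory, assuming the full
pairwise-distance matrix gets built (which I believe is what
silhouette_samples does internally):

n = 621 * 1405              # 872505 samples
bytes_needed = n * n * 8    # one float64 distance per pair of points
print(bytes_needed / 1e12)  # roughly 6.1 TB, far beyond 64G of RAM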

Thanks.

On Thu, Nov 16, 2017 at 1:14 AM, Luigi Lomasto <
l.loma...@innovationengineering.eu> wrote:

> Hi Shiudan,
>
> You can take a look at this link:
> https://github.com/biolab/orange3/issues/1502
>
> You have a 3-dimensional problem, right? For each point you have 12
> values, so your RAM is probably too small. How much RAM does your PC
> have?
> Let me know,
>
> Luigi
>
>
> On 16 Nov 2017, at 09:18, Shiheng Duan <shid...@ucdavis.edu> wrote:
>
> Hi all,
>
> I am doing clustering work and want to use the silhouette score to
> determine the number of clusters, but I get a MemoryError when executing
> silhouette_samples. I searched for it and found something related to
> numpy, but I cannot reproduce the numpy error. Is there any solution to
> this?
>
> The data is 621*1405*12.
>
> Thanks!
>
_______________________________________________
scikit-learn mailing list
scikit-learn@python.org
https://mail.python.org/mailman/listinfo/scikit-learn
