Gabriel, as I already said, you're using a classifier for a regression problem. Please read the docs.
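In scikit-learn, LDA is a classifier: fit() expects y to be a one-dimensional array with one discrete class label per row of X, not a matrix of continuous values. That is also where your traceback comes from: a ZeroDivisionError at fac = 1. / (n_samples - n_classes) means LDA found exactly as many classes as samples. A minimal sketch of classifier-style input, with made-up toy data rather than your arrays:

    import numpy as np
    from sklearn.lda import LDA  # same import as in your code (the sklearn.lda module at the time)

    # 6 samples, 3 features each
    X = np.array([[0.0, 0.1, 0.2],
                  [0.1, 0.0, 0.3],
                  [0.2, 0.1, 0.0],
                  [1.0, 0.9, 1.1],
                  [0.9, 1.0, 1.2],
                  [1.1, 0.8, 1.0]])

    # one integer class label per sample -- this is what a classifier expects
    y = np.array([0, 0, 0, 1, 1, 1])

    clf = LDA()
    clf.fit(X, y)  # n_samples=6, n_classes=2, so no division by zero
    print(clf.predict([[0.05, 0.1, 0.25]]))  # a point near the first cluster

If your targets are continuous numbers rather than class labels, a regressor is the right tool.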
Maybe this helps:
http://scikit-learn.org/stable/tutorial/statistical_inference/supervised_learning.html

Hint: replace LDA with sklearn.linear_model.Ridge, for example (see the sketch after the quoted message below).

HTH
Alex

On Sat, Nov 9, 2013 at 7:05 PM, Gabriel Peschl <gabrielpes...@gmail.com> wrote:
> Hi,
>
> I am trying to implement the LDA algorithm using sklearn, in Python.
>
> The code is:
>
> import numpy as np
> from sklearn.lda import LDA
>
> X = np.array([[0.000000, 0.000000, 0.000000, 0.000000, 0.001550,
>                # ... 122 more values in the original post ...
>                0.022222, 0.077778, 0.055556, 0.000000, 0.102757],
>               [0.000000, 0.000000, 0.000000, 0.000000, 0.001550,
>                # ... 122 more values in the original post ...
>                0.022222, 0.077778, 0.055556, 0.000000, 0.102757]])
>
> y = np.array([[0.000000, 0.000000, 0.008821, 0.000000, 0.000000,
>                # ... 254 more values in the original post ...
>                0.000000, 0.020942, 0.015707, 0.000000, 0.029010],
>               [0.000000, 0.000000, 0.008821, 0.000000, 0.000000,
>                # ... 122 more values in the original post ...
>                0.000000, 0.020942, 0.015707, 0.000000, 0.029010]])
>
> clf = LDA()
> clf.fit(X, y)
> print(clf.predict([1, 2]))
>
> But I got this error message:
>
> clf.fit(X, y)
> fac = 1. / (n_samples - n_classes)
> ZeroDivisionError: float division by zero
>
> What can I do to solve this error?
>
> I am using this version of LDA from scikit-learn:
> http://scikit-learn.org/stable/modules/generated/sklearn.lda.LDA.html
>
> The second question is:
>
> Can I use sklearn.lda with two .txt files? The files are 68.830 kB and 174.317 kB; the
> first one is a test file and the second is a training file.
>
> How can I use them? Any suggestions?
>
> Thank you very much!
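Here is the Ridge sketch mentioned above. The file names and the assumption that the last column of each .txt file holds the target value are made up; adapt them to however your two text files are actually laid out (numpy.loadtxt can read whitespace-separated numeric text files directly):

    import numpy as np
    from sklearn.linear_model import Ridge

    # Hypothetical file names and layout: whitespace-separated numbers,
    # last column = continuous target, remaining columns = features.
    train = np.loadtxt("training.txt")
    test = np.loadtxt("test.txt")

    X_train, y_train = train[:, :-1], train[:, -1]
    X_test = test[:, :-1]

    reg = Ridge(alpha=1.0)  # alpha is the regularization strength
    reg.fit(X_train, y_train)
    print(reg.predict(X_test))

The same pattern works with any of the regressors in the tutorial linked above; Ridge is just a simple regularized linear model to start with.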