Hmm.. guess I can give it a try.. I'm currently optimizing with for loops..

> On 1 May 2017 at 05:19, Joel Nothman <joel.noth...@gmail.com> wrote:
>
> Unless I'm mistaken about what we're looking at, you could use something like:
>
>     class ToMultiInput(TransformerMixin, BaseEstimator):
>         def fit(self, shapes):
>             self.shapes = shapes
>         def transform(self, X):
>             return [X.]
>
>     tmi = ToMultiInput([single.shape for single in train_input])
>     # this assumes that train_input is a sequence of ndarrays with the same
>     # first dimension:
>     train_input = np.hstack([single.reshape(single.shape[0], -1)
>                              for single in train_input])
>
>     GridSearchCV(make_pipeline(tmi, my_predictor), ...)
>
> On 1 May 2017 at 11:45, Carlton Banks <nofl...@gmail.com> wrote:
>
> How … batchsize could also be 1, I've just stored it like that.
>
> But how do I reshape my data to be a matrix.. that's the big question.. is it
> possible?
>
>> On 1 May 2017 at 02:21, Joel Nothman <joel.noth...@gmail.com> wrote:
>>
>> Do each of your 33 inputs have a batch of size 100? If you reshape your data
>> so that it all fits in one matrix, and then split it back out into its 33
>> components as the first transformation in a Pipeline, there should be no
>> problem.
>>
>> On 1 May 2017 at 10:17, Joel Nothman <joel.noth...@gmail.com> wrote:
>>
>> Sorry, I don't know enough about Keras and its terminology.
>>
>> Scikit-learn usually limits itself to datasets where features and targets
>> are a rectangular matrix.
>>
>> But grid search and other model selection tools should allow data of other
>> shapes as long as they can be indexed on the first axis. You may be best
>> off, however, getting support from the Keras folks.
>>
>> On 30 April 2017 at 23:23, Carlton Banks <nofl...@gmail.com> wrote:
>>
>> It seems like scikit-learn is not able to handle networks with multiple
>> inputs.
>> The Keras documentation states:
>>
>> You can use Sequential Keras models (single-input only) as part of your
>> Scikit-Learn workflow via the wrappers found at keras.wrappers.scikit_learn.py.
>>
>> But besides what the wrapper can do.. can scikit-learn really not handle
>> multiple inputs?..
>>
>>> On 30 April 2017 at 14:18, Carlton Banks <nofl...@gmail.com> wrote:
>>>
>>> The shapes are
>>>
>>>     print len(train_input)
>>>     print train_input[0].shape
>>>     print train_output.shape
>>>
>>>     33
>>>     (100, 8, 45, 3)
>>>     (100, 1, 145)
>>>
>>> 100 is the batch size..
>>>
>>>> On 30 April 2017 at 12:57, Joel Nothman <joel.noth...@gmail.com> wrote:
>>>>
>>>> Scikit-learn should accept a list as X to grid search and index it just
>>>> fine. So I'm not sure that constraint applies to grid search.
>>>>
>>>> On 30 April 2017 at 20:11, Julio Antonio Soto de Vicente <ju...@esbet.es> wrote:
>>>>
>>>> Tbh I've never tried, but I would say that the current sklearn API does not
>>>> support multi-input data...
>>>>
>>>> On 30 Apr 2017, at 12:02, Joel Nothman <joel.noth...@gmail.com> wrote:
>>>>
>>>>> What are the shapes of train_input and train_output?
>>>>>
>>>>> On 30 April 2017 at 12:59, Carlton Banks <nofl...@gmail.com> wrote:
>>>>>
>>>>> I am currently trying to run GridSearchCV on a Keras model which has
>>>>> multiple inputs.
>>>>> The inputs are stored in a list, in which each entry is the input for a
>>>>> specific channel.
>>>>>
>>>>> Here is my model and how I use the grid search:
>>>>>
>>>>> https://pastebin.com/GMKH1L80
>>>>>
>>>>> The error I am getting is:
>>>>>
>>>>> https://pastebin.com/A3cB0rMv
>>>>>
>>>>> Any idea how I can resolve this?
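Joel's ToMultiInput snippet above is cut off at "return [X.", so here is one
minimal, untested sketch of how it could be completed, following the same idea:
hstack the flattened channels into one matrix so grid search can index plain
rows, then split them back out as the first step of the Pipeline. The details
beyond his snippet are assumptions on my part: the per-sample shapes
(single.shape[1:], not the full shapes) go to __init__ so GridSearchCV can
clone the transformer, fit() is a no-op, and my_predictor stands in for a final
estimator that can consume a list of arrays.

    import numpy as np
    from sklearn.base import BaseEstimator, TransformerMixin

    class ToMultiInput(BaseEstimator, TransformerMixin):
        """Split one stacked 2-D matrix back into a list of per-channel arrays."""

        def __init__(self, shapes):
            # Per-sample shape of each channel, e.g. 33 entries of (8, 45, 3).
            self.shapes = shapes

        def fit(self, X, y=None):
            # Nothing is learned; the shapes are fixed at construction time.
            return self

        def transform(self, X):
            # Number of flattened columns each channel occupies in X.
            widths = [int(np.prod(shape)) for shape in self.shapes]
            offsets = np.cumsum([0] + widths)
            # Cut X column-wise and restore each slice to (n_samples,) + shape.
            return [X[:, lo:hi].reshape((X.shape[0],) + tuple(shape))
                    for shape, lo, hi in zip(self.shapes, offsets[:-1], offsets[1:])]

Usage along the lines of Joel's message (train_input being the list of 33
arrays, each with the same first dimension of 100):

    shapes = [single.shape[1:] for single in train_input]
    X = np.hstack([single.reshape(single.shape[0], -1) for single in train_input])

    tmi = ToMultiInput(shapes)
    channels = tmi.fit_transform(X)   # list of 33 arrays, each (100, 8, 45, 3)
    # GridSearchCV(make_pipeline(tmi, my_predictor), param_grid=...).fit(X, y)

Whether the final estimator can actually consume that list is a separate
question: as noted earlier in the thread, the stock Keras scikit-learn wrapper
is single-input only, so my_predictor would need to be a custom wrapper that
accepts a list of inputs.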
_______________________________________________
scikit-learn mailing list
scikit-learn@python.org
https://mail.python.org/mailman/listinfo/scikit-learn