Hi Ryan.
> I agree that there are some disadvantages to the approach of passing
> DatasetInfo into the constructor, but I think it's important to try and
> make the burden as light as possible on the users. So personally I
> think that even though this will cause some extra code and methods, it
>
On Tue, May 09, 2017 at 10:37:59AM +0500, Kirill Mishchenko wrote:
> Hi Ryan.
>
> >> My suggestion is to add another overload:
> >>
> >> HyperParameterOptimizer<...> h(data, datasetInfo, labels);
> >>
> >> This is because I consider the dataset information, which encodes the
> >> types of dimen
Hi Ryan.
>> My suggestion is to add another overload:
>>
>> HyperParameterOptimizer<...> h(data, datasetInfo, labels);
>>
>> This is because I consider the dataset information, which encodes the
>> types of dimensions, to be a part of the dataset. Not all machine
>> learning methods support a
Hi Ryan.
> My suggestion is to add another overload:
>
> HyperParameterOptimizer<...> h(data, datasetInfo, labels);
>
> This is because I consider the dataset information, which encodes the
> types of dimensions, to be a part of the dataset. Not all machine
> learning methods support a Dataset
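The proposed overload pair can be sketched without mlpack itself. In this sketch, Matrix, Labels, and DatasetInfo are simplified stand-ins for arma::mat, arma::Row<size_t>, and mlpack's data::DatasetInfo, so this only illustrates the shape of the two constructors, not the real mlpack API:

```cpp
#include <string>
#include <vector>

// Simplified stand-ins for the real mlpack/armadillo types (assumption of
// this sketch; the real types have much richer interfaces).
using Matrix = std::vector<std::vector<double>>;
using Labels = std::vector<int>;
struct DatasetInfo { std::vector<std::string> dimensionTypes; };

// Sketch of the proposed pair of overloads: one constructor for purely
// numeric data, and one that also carries the dataset information.
class HyperParameterOptimizer
{
 public:
  HyperParameterOptimizer(const Matrix& /* data */, const Labels& /* labels */)
      : hasInfo(false) { }

  HyperParameterOptimizer(const Matrix& /* data */,
                          const DatasetInfo& /* info */,
                          const Labels& /* labels */)
      : hasInfo(true) { }

  bool HasDatasetInfo() const { return hasInfo; }

 private:
  bool hasInfo;
};
```

Methods that do not support categorical dimensions would simply never be handed the second overload.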
On Wed, Apr 26, 2017 at 11:24:18AM +0500, Kirill Mishchenko wrote:
> Hi Ryan.
>
> > The key problem, like you said, is that we don't know what AuxType
> > should be so we can't call its constructor. But maybe we can adapt
> > things a little bit:
> >
> > template<typename T>
> > struct Holder /* needs a bet
Hi Ryan.
> The key problem, like you said, is that we don't know what AuxType
> should be so we can't call its constructor. But maybe we can adapt
> things a little bit:
>
> template<typename T>
> struct Holder /* needs a better name */
> {
> // This typedef allows us access to the type we need to construc
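The quoted sketch is cut off, but the idea it describes can be illustrated as follows. The member names (Type, Construct) are assumptions of this sketch, not Ryan's actual proposal; the point is only that a holder which records its type lets later code name and construct that type once the arguments are known:

```cpp
#include <type_traits>
#include <utility>

// Sketch of the "holder" idea: record the type to be constructed so that
// code which only sees Holder<T> can still construct a T later.
template<typename T>
struct Holder /* needs a better name */
{
  // This typedef allows us access to the type we need to construct.
  using Type = T;

  // Construct a Type once the arguments are finally known.
  template<typename... Args>
  static Type Construct(Args&&... args)
  {
    return Type(std::forward<Args>(args)...);
  }
};

// Example aux class matching the thread: AuxType(double c, double d).
struct ExampleAux
{
  double c, d;
  ExampleAux(double c, double d) : c(c), d(d) { }
};
```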
On Thu, Apr 20, 2017 at 11:38:32AM +0500, Kirill Mishchenko wrote:
> Hi Ryan.
>
> > However this makes it unwieldy to optimize over AuxType objects with
> > multiple parameters. Maybe a thought then is to pass something a little
> > more complex:
> >
> > std::tuple<std::array<double, 3>, std::array<double, 2>> t =
> >std::m
Hi Ryan.
> However this makes it unwieldy to optimize over AuxType objects with
> multiple parameters. Maybe a thought then is to pass something a little
> more complex:
>
> std::tuple<std::array<double, 3>, std::array<double, 2>> t =
>     std::make_tuple({ 1.0, 2.0, 4.0 }, { 2.0, 3.0 });
> (I think the syntax is probably wrong
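As the email suspects, that syntax does not compile: std::make_tuple cannot deduce a type from a bare braced list. A compiling version spells out the std::array types explicitly, one array of candidate values per constructor argument:

```cpp
#include <array>
#include <tuple>

// One std::array of candidate values per constructor argument,
// grouped into a tuple (a compiling version of the quoted sketch).
std::tuple<std::array<double, 3>, std::array<double, 2>> MakeGrids()
{
  return std::make_tuple(std::array<double, 3>{{ 1.0, 2.0, 4.0 }},
                         std::array<double, 2>{{ 2.0, 3.0 }});
}
```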
Hi Kirill,
Thanks for the response. I think this email chain is getting quite long
now, so sorry if there is a lot of reading to do. :)
On Mon, Apr 17, 2017 at 08:57:21AM +0500, Kirill Mishchenko wrote:
> Hi Ryan.
>
> > - Use template metaprogramming tricks to, given a type, expand all of
> >
Hi Ryan.
> - Use template metaprogramming tricks to, given a type, expand all of
> its constructor arguments into a list of numeric types. So say we
> had:
>
> Learner(double a, AuxType b)
> AuxType(double c, double d)
>
> we would ideally want to extract [double, double, double]
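One way to realise this extraction, sketched under an assumption the email does not make: each class advertises its constructor argument types in a nested ConstructorArgs typedef (real code would need some way to discover them). Arithmetic types flatten to themselves; class types flatten to the concatenation of their flattened constructor arguments:

```cpp
#include <tuple>
#include <type_traits>
#include <utility>

// Primary template: an arithmetic leaf contributes itself.
template<typename T, typename Enable = void>
struct Flatten
{
  using type = std::tuple<T>;
};

// Flatten every element of a tuple of types and concatenate the results.
template<typename T>
struct FlattenTuple;

template<typename... Ts>
struct FlattenTuple<std::tuple<Ts...>>
{
  using type = decltype(std::tuple_cat(
      std::declval<typename Flatten<Ts>::type>()...));
};

// Non-arithmetic types recurse into their advertised constructor arguments.
template<typename T>
struct Flatten<T,
    typename std::enable_if<!std::is_arithmetic<T>::value>::type>
{
  using type = typename FlattenTuple<typename T::ConstructorArgs>::type;
};

// The example from the email: Learner(double a, AuxType b),
// AuxType(double c, double d).
struct AuxType { using ConstructorArgs = std::tuple<double, double>; };
struct Learner { using ConstructorArgs = std::tuple<double, AuxType>; };
```

With these definitions, Flatten<Learner>::type is std::tuple<double, double, double>, i.e. the [double, double, double] the email asks for.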
On Mon, Apr 10, 2017 at 11:13:50AM +0500, Kirill Mishchenko wrote:
> Hi Ryan,
>
> I think I’m starting to see your perspective on how the grid search
> optimiser should be implemented, but some concerns remain.
Hi Kirill,
Sorry for the slow response.
> 1. Some information (precision) can be lost du
Hi Ryan,
I think I’m starting to see your perspective on how the grid search
optimiser should be implemented, but some concerns remain.
1. Some information (precision) can be lost during conversions between integer
and floating-point values (e.g., when encoding a size_t value into a cell of an
arma::mat
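This concern can be demonstrated without armadillo: arma::mat stores doubles, and a double has a 53-bit mantissa, so integers above 2^53 no longer round-trip exactly through a matrix cell:

```cpp
#include <cstdint>

// Encode an integer as a double (as storing it in an arma::mat cell would)
// and check whether decoding recovers the original value.
bool RoundTripsExactly(std::uint64_t value)
{
  const double encoded = static_cast<double>(value);
  const std::uint64_t decoded = static_cast<std::uint64_t>(encoded);
  return decoded == value;
}
```

Every value up to 2^53 survives the round trip; 2^53 + 1 rounds to 2^53 and is lost.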
On Fri, Apr 07, 2017 at 10:26:45AM +0500, Kirill Mishchenko wrote:
> Hi Ryan.
>
> So far it has been hard for me to imagine how to make the grid search
> optimiser have a similar interface to the already implemented optimisers
> like SGD, since they work in slightly different domains. I guess a
> reasonable i
Hi Ryan.
So far it has been hard for me to imagine how to make the grid search
optimiser have a similar interface to the already implemented optimisers like
SGD, since they work in slightly different domains. I guess a reasonable
interface for the grid search optimiser would allow the following usage.
arma::mat data
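The quoted usage example is cut off, so here is a minimal, library-free illustration of what any grid search optimiser ultimately does: evaluate a quality function at every candidate value and keep the best. The real mlpack design would operate on arma::mat data and a cross-validation measure rather than a plain std::function; that substitution is an assumption of this sketch:

```cpp
#include <cstddef>
#include <functional>
#include <vector>

// Exhaustively evaluate `quality` at each candidate and return the best
// quality found, writing the best candidate into `bestValue`.
double GridSearchMaximize(const std::vector<double>& candidates,
                          const std::function<double(double)>& quality,
                          double& bestValue)
{
  double bestQuality = quality(candidates.at(0));
  bestValue = candidates.at(0);
  for (std::size_t i = 1; i < candidates.size(); ++i)
  {
    const double q = quality(candidates[i]);
    if (q > bestQuality)
    {
      bestQuality = q;
      bestValue = candidates[i];
    }
  }
  return bestQuality;
}
```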
On Sat, Apr 01, 2017 at 10:55:45AM +0500, Kirill Mishchenko wrote:
> Hi Ryan.
>
> I’m planning to implement the following functionality as a GSoC project:
> Measurements
> Accuracy
> Mean squared error
> Precision
> Recall
> F1
> Validation
> Simple validation (splitting data once with validation
Hi Ryan.
I’m planning to implement the following functionality as a GSoC project:
Measurements
  Accuracy
  Mean squared error
  Precision
  Recall
  F1
Validation
  Simple validation (splitting data once with a validation set size specified
  by the user)
  K-fold cross validation
Hyper-parameter tuning
  Grid search
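The k-fold item in the plan above reduces to a small indexing problem: split points 0..n-1 into k folds, where fold i serves as the validation set while the remaining folds form the training set. A library-free sketch of that split (the function name and the [begin, end) convention are choices of this sketch, not the eventual mlpack API):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Return k half-open index ranges [begin, end) covering 0..n-1.
// The first (n % k) folds absorb one extra point each, so fold sizes
// differ by at most one.
std::vector<std::pair<std::size_t, std::size_t>> FoldBounds(std::size_t n,
                                                            std::size_t k)
{
  std::vector<std::pair<std::size_t, std::size_t>> bounds;
  const std::size_t base = n / k, extra = n % k;
  std::size_t begin = 0;
  for (std::size_t i = 0; i < k; ++i)
  {
    const std::size_t size = base + (i < extra ? 1 : 0);
    bounds.emplace_back(begin, begin + size);
    begin += size;
  }
  return bounds;
}
```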
On Wed, Mar 29, 2017 at 06:15:16PM +0500, Kirill Mishchenko wrote:
> Thanks for your answer; I was thinking about much the same solution.
>
> I have yet another question, an organisational one. There are several
> phases of evaluation during coding under the GSoC program. Namely,
> there are t
Thanks for your answer; I was thinking about much the same solution.
I have yet another question, an organisational one. There are several phases
of evaluation during coding under the GSoC program. Namely, there are three:
at the end of June, at the end of July, and at the end of August. My q
On Tue, Mar 21, 2017 at 05:09:46PM +0500, Kirill Mishchenko wrote:
> Ryan,
>
> I’m working on a proposal for the idea, and wondering whether the
> hyper-parameter module should be flexible enough to support metrics
> with different correlations. E.g., if we use accuracy as a metric,
> then we want to
Ryan,
I’m working on a proposal for the idea, and I am wondering whether the
hyper-parameter module should be flexible enough to support metrics with
different correlations. E.g., if we use accuracy as a metric, then we want to
find a model that maximises this metric; on the other hand, if we want to us
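One simple way to support both directions, sketched here as an assumption rather than the thread's agreed design: each measure declares whether larger values are better, and the tuner compares candidates through that flag:

```cpp
// Each measure declares its "direction": accuracy is maximised,
// mean squared error is minimised.
struct Accuracy         { static constexpr bool LargerIsBetter = true;  };
struct MeanSquaredError { static constexpr bool LargerIsBetter = false; };

// True if `candidate` is an improvement over `best` for this measure.
template<typename Measure>
bool Improves(double candidate, double best)
{
  return Measure::LargerIsBetter ? (candidate > best) : (candidate < best);
}
```

The tuner itself then never needs to know which metric it is optimising; it only asks the measure's policy.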
On Wed, Feb 22, 2017 at 05:07:39PM +0500, Kirill Mishchenko wrote:
> Hi,
>
> my name is Kirill. I’m interested in contributing to the project
> “Cross-validation and hyper-parameter tuning infrastructure”. I have
> already gone through some initial steps, like building the code and
> running
Hi,
my name is Kirill. I’m interested in contributing to the project
“Cross-validation and hyper-parameter tuning infrastructure”. I have already
gone through some initial steps, like building the code and running a few ML
algorithms (more precisely, I have done this for Linear Regression and