From what you are saying, isn't the Jaccard similarity for the multi-class
case equivalent to (1 - hamming loss), where the hamming loss is the fraction
of positions where the two vectors differ?
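
A quick numerical check of what I mean (just a sketch, assuming a
scikit-learn version from around now where jaccard_similarity_score still
exists and accepts multiclass input):

from sklearn.metrics import jaccard_similarity_score, hamming_loss, accuracy_score

# multiclass targets: every entry is a class label, not a 0/1 indicator
y_true = [0, 1, 3, 2]
y_pred = [0, 2, 3, 2]

# if my reading is right, for multiclass input the Jaccard score reduces to
# plain accuracy, which is exactly 1 - hamming loss
print(jaccard_similarity_score(y_true, y_pred))  # 0.75
print(1 - hamming_loss(y_true, y_pred))          # 0.75
print(accuracy_score(y_true, y_pred))            # 0.75
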
I also want to understand what your examples represent. Could you give an
example where the dimension of y is 2 x 3? I am getting confused about what
the 2 represents: is it the number of columns or the number of rows?
--
sp
On Mon, May 9, 2016 at 7:44 PM, Maniteja Nandana <
maniteja.modesty...@gmail.com> wrote:
>
> On 9 May 2016 5:24 pm, "Shishir Pandey" <shishir...@gmail.com> wrote:
> >
> > This is what I am having trouble understanding. What does each dimension
> > of the vector represent? I am thinking of it as follows:
> >
> > [label_1, label_2, ..., label_N]
> >
> > a characteristic vector would be something like [1, 1, 0, ..., 1, 0, 0]
> >
> > This represents whether label_i is present in the set or not, right? In
> > that case the answer would be different: a 0 in both sets would mean that
> > the label is not present in either of them, and hence the union would be
> > smaller than the dimension of the vector.
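> >
> > Concretely, the picture I have in mind (just a sketch with plain Python
> > sets, not whatever scikit-learn actually does internally):
> >
> > # hypothetical indicator vectors over labels 0..2
> > a = [1, 1, 0]
> > b = [1, 0, 0]
> >
> > # the set view keeps only the positions where a label is present, so a
> > # shared 0 contributes to neither the intersection nor the union
> > set_a = {i for i, v in enumerate(a) if v}   # {0, 1}
> > set_b = {i for i, v in enumerate(b) if v}   # {0}
> > print(len(set_a & set_b) / len(set_a | set_b))   # 0.5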
> >
> > --
> > sp
> >
>
> Sorry if I am misunderstanding here, but in case you are referring to
> multi-label classification, something like an array of [[0,0],[0,1]] would
> be the value of the predicted outputs, whereas an array like [0,1,3,2]
> represents multi-class output and is what the example under discussion uses.
> Regarding the size of the union: in the multiclass and binary cases, the
> intersection size is the number of samples where the predicted class equals
> the true class, and the union size is the total number of samples. Here 0
> need not mean the absence of a class.
> But in the multi-label case a 0 does mean the absence of a label, and the
> Jaccard similarity is calculated for each output and then averaged (a
> weighted mean if sample weights are given).
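>
> A tiny sketch of how the two input shapes are interpreted differently
> (type_of_target is the scikit-learn utility that does this check; output
> shown assuming a version from around this time):
>
> from sklearn.utils.multiclass import type_of_target
>
> # entries are class labels, so 0 is just another class
> print(type_of_target([0, 1, 3, 2]))       # 'multiclass'
>
> # a 2-D array of 0/1 flags, so 0 means the label is absent
> print(type_of_target([[0, 0], [0, 1]]))   # 'multilabel-indicator'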
>
> In the following, the first sample has only one label in the ground truth
> but two in the prediction, whereas in the second example the first sample
> has the same single label in both the ground truth and the prediction.
>
> from sklearn.metrics import jaccard_similarity_score
>
> y_true = [[0, 1], [1, 1]]
> y_pred = [[1, 1], [1, 1]]
> print(jaccard_similarity_score(y_true, y_pred))
> # 0.75, i.e. (0.5 + 1) / 2
>
> y_true = [[0, 1], [1, 1]]
> y_pred = [[0, 1], [1, 1]]
> print(jaccard_similarity_score(y_true, y_pred))
> # 1.0, i.e. (1 + 1) / 2
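>
> The same numbers can be reproduced by hand with plain Python sets (just a
> sketch of the per-sample computation, ignoring the corner case where both
> rows are all zeros; this is not scikit-learn's actual implementation):
>
> def sample_jaccard(t, p):
>     # treat each 0/1 indicator row as the set of labels that are present
>     ts = {i for i, v in enumerate(t) if v}
>     ps = {i for i, v in enumerate(p) if v}
>     return len(ts & ps) / len(ts | ps)
>
> y_true = [[0, 1], [1, 1]]
> y_pred = [[1, 1], [1, 1]]
> per_sample = [sample_jaccard(t, p) for t, p in zip(y_true, y_pred)]
> print(per_sample)                          # [0.5, 1.0]
> print(sum(per_sample) / len(per_sample))   # 0.75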
>
> Hope it helps.
>
> Regards,
> Maniteja.
_______________________________________________
Scikit-learn-general mailing list
Scikit-learn-general@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/scikit-learn-general