On 19.07.2011 01:32, Ted Dunning wrote:
Well... it makes it uncomputable in explicit form. Sometimes there are
implicit forms for the matrix that keep the size in bounds. For instance,
limited rank decompositions (aka truncated singular value decompositions)
can be represented by storing two skinny matrices and a diagonal that don't
take much more memory/disk than the original data.
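
(A minimal sketch of what such an implicit form looks like -- plain Java with made-up
names, not Mahout code. It only shows that a rank-k factorization keeps an m x n
matrix around as O((m + n) * k) values instead of O(m * n):)

// A rank-k truncated SVD of an m x n matrix A, stored as two skinny
// factors plus a diagonal: A ~ U * diag(sigma) * V^T.
class TruncatedSvd {
  final double[][] u;   // m x k left singular vectors
  final double[] sigma; // k singular values (the diagonal)
  final double[][] v;   // n x k right singular vectors

  TruncatedSvd(double[][] u, double[] sigma, double[][] v) {
    this.u = u;
    this.sigma = sigma;
    this.v = v;
  }

  // Any single entry of the dense product can be recomputed on demand,
  // so the full m x n matrix never has to be materialized.
  double entry(int i, int j) {
    double sum = 0.0;
    for (int f = 0; f < sigma.length; f++) {
      sum += u[i][f] * sigma[f] * v[j][f];
    }
    return sum;
  }
}
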
Aren't the user/item feature matrices already a form of this?
I think the basic question was how to compute recommendations for a
particular user. You could predict his preferences for all items he has
not yet seen by multiplying the item feature matrix with his user
feature vector to get the estimated preferences, but you cannot do this
for all users at once, right?
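
(A minimal sketch of that per-user computation, again plain Java with made-up
names rather than Mahout code:)

// Score all items for a single user from the factor matrices.
// itemFeatures is numItems x k, userFeatures is that user's k-vector.
class UserPrediction {
  static double[] predictForUser(double[][] itemFeatures, double[] userFeatures) {
    double[] scores = new double[itemFeatures.length];
    for (int item = 0; item < itemFeatures.length; item++) {
      double dot = 0.0;
      for (int f = 0; f < userFeatures.length; f++) {
        dot += itemFeatures[item][f] * userFeatures[f];
      }
      scores[item] = dot; // estimated preference of this user for this item
    }
    return scores; // items the user has already rated would still need filtering
  }
}

This is cheap for one user (numItems * k multiplications), but doing it for every
user materializes the full dense user-times-item matrix, which is exactly the
problem described in the quoted comment below.
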
So the question would be how to find the initial "candidate items" from
the item feature matrix.
Hope I didn't misunderstand this.
--sebastian
On Mon, Jul 18, 2011 at 12:17 AM, Sebastian Schelter (JIRA) <[email protected]> wrote:
The problem with this naive approach is that the resulting matrix is going
to be huge (millions of users times hundreds of thousands of items) and dense,
which makes it uncomputable.
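
(For a sense of scale, with purely illustrative numbers: 10^7 users times 2 * 10^5
items is 2 * 10^12 entries; at 8 bytes per double that is roughly 16 TB for the
dense prediction matrix alone.)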