I sometimes come across algorithms whose running time depends on the size of
their output. One example is the family of pattern mining algorithms in data
mining: how long they run depends on the number of patterns present in the
input data. (I believe such algorithms are called output-sensitive.)
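To make the question concrete, here is a toy Apriori-style frequent-itemset
miner in Python. It is my own illustration, not taken from any specific paper,
and the transaction data and min_support threshold are made up:

def frequent_itemsets(transactions, min_support):
    # Enumerate all itemsets appearing in at least min_support
    # transactions, level by level (Apriori-style). The work done
    # at level k depends on how many frequent (k-1)-itemsets were
    # found at the previous level, so the total running time grows
    # with the number of patterns in the data, not just with the
    # input size.
    items = sorted({i for t in transactions for i in t})
    level = [frozenset([i]) for i in items
             if sum(1 for t in transactions if i in t) >= min_support]
    result = list(level)
    k = 2
    while level:
        # Candidates for level k: unions of two frequent sets from
        # the previous level that have exactly k items.
        candidates = {a | b for a in level for b in level if len(a | b) == k}
        level = [c for c in candidates
                 if sum(1 for t in transactions if c <= t) >= min_support]
        result.extend(level)
        k += 1
    return result

# Made-up example data: 5 transactions over items a, b, c.
transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"},
                {"b", "c"}, {"a", "b", "c"}]
print(frequent_itemsets(transactions, min_support=3))

With min_support = 1 on the same data, the miner would have to report all
2^3 - 1 = 7 non-empty itemsets, so its running time is bounded below by the
size of the output. That is exactly the kind of behavior I mean.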

My problem is: how do I calculate the time complexity of this kind of
algorithm? I am simply not used to doing this. Any clue or pointer to a
tutorial would be helpful. Thanks very much.

