Complexity (or more broadly, resource requirements, which encompass aspects beyond compute cycles, such as memory footprint and memory bandwidth) should be a fundamental part of the evaluation methodology. But there are many pitfalls to avoid here.
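As a small illustration of one such pitfall (the kernel and numbers here are illustrative, not from the thread): the same O(n) algorithm, implemented two ways, produces identical results but different measured runtimes, so raw timings are a noisy proxy for intrinsic complexity.

```python
# Illustrative sketch: two functionally identical implementations of a
# sum-of-absolute-differences (SAD) kernel, a common video-codec inner
# loop. Both have the same intrinsic O(n) complexity, yet measured
# runtime depends on how aggressively each is optimized.
import timeit

def sad_naive(a, b):
    # Straightforward element-by-element loop.
    total = 0
    for i in range(len(a)):
        total += abs(a[i] - b[i])
    return total

def sad_optimized(a, b):
    # Same algorithm, restructured onto built-ins the interpreter runs
    # in C -- loosely analogous to unrolling or intrinsics in native code.
    return sum(map(abs, map(int.__sub__, a, b)))

a = list(range(256))
b = list(range(255, -1, -1))
assert sad_naive(a, b) == sad_optimized(a, b)

t_naive = timeit.timeit(lambda: sad_naive(a, b), number=2000)
t_opt = timeit.timeit(lambda: sad_optimized(a, b), number=2000)
print(f"naive: {t_naive:.3f}s  optimized: {t_opt:.3f}s")
```

The two timings typically differ by an integer factor even though nothing about the algorithm changed, which is exactly why a complexity metric has to state the optimization level it was measured at.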
Intrinsic algorithmic complexity is difficult to discern with any precision, since any measurement necessarily depends on the level of implementation optimization (from simple loop unrolling to vector intrinsics to full assembly). Optimization can also hamper experimentation, so it is usually best avoided in the research stage.

Configuration points are also important to consider. Configuration points that are impractical for real-world use are still valuable for steering the research. They should not be the primary evaluation criterion, which should focus on real-world use, but such configuration points should be allowed and analyzed.

Mo

On 3/23/15, 8:40 AM, Harald Alvestrand <[email protected]> wrote:
> I also wonder a bit about how we should treat evaluation of performance -
> a traditional video codec development technique has been to develop a
> codec that takes hours-per-second to encode a video, and then seek to
> optimize it by a factor of a hundred before shipping it.
>
> But sometimes, there's just limits to how much one can achieve. How
> should we treat that aspect in evaluation?

_______________________________________________
video-codec mailing list
[email protected]
https://www.ietf.org/mailman/listinfo/video-codec
