On Monday, 22 December 2014 at 12:24:52 UTC, Laeeth Isharc wrote:

In case it wasn't obvious from the discussion that followed: finance is a broad field with many different kinds of creature within, and there are different kinds of problems faced by different participants.

High Frequency Trading has peculiar requirements (relating to latency, amongst other things) that will not necessarily be representative of other areas. Even within this area, the needs of a Citadel in its option market-making activity differ from those of a pure delta HFT player (although they also overlap).

A JP Morgan that needs to be able to price and calculate risk for large portfolios of convex instruments in its vanilla and exotic options books has different requirements, again.

You would typically use Monte Carlo (or quasi-MC) to price more complex products for which there is no good analytical approximation, or to deal with the fact that volatility is not constant. So that fits very much with the needs of large banks - and perhaps some hedge funds - but I don't think a typical HFT guy would be all that interested in this. They are different domains.
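
To make the Monte Carlo point concrete, here is a rough sketch of what a plain MC pricer for a European call under Black-Scholes dynamics might look like in D. A toy example, not production code - the parameter values are invented for illustration:

    import std.math : cos, exp, fmax, log, sqrt, PI;
    import std.random : Random, uniform01, unpredictableSeed;
    import std.stdio : writefln;

    double mcCallPrice(double s0, double strike, double r, double sigma,
                       double maturity, size_t paths, ref Random rng)
    {
        immutable drift = (r - 0.5 * sigma * sigma) * maturity;
        immutable vol   = sigma * sqrt(maturity);
        double sum = 0.0;
        foreach (_; 0 .. paths)
        {
            // Box-Muller transform: two uniform draws -> one standard normal.
            immutable u1 = uniform01(rng);
            immutable u2 = uniform01(rng);
            immutable z  = sqrt(-2.0 * log(1.0 - u1)) * cos(2.0 * PI * u2);
            immutable st = s0 * exp(drift + vol * z); // terminal spot
            sum += fmax(st - strike, 0.0);            // call payoff
        }
        return exp(-r * maturity) * sum / paths;      // discounted mean
    }

    void main()
    {
        auto rng = Random(unpredictableSeed);
        writefln("MC call price: %.4f",
                 mcCallPrice(100.0, 100.0, 0.01, 0.2, 1.0, 1_000_000, rng));
    }

Quasi-MC would swap the pseudo-random draws for a low-discrepancy sequence; the structure stays the same.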

Quant/CTA funds also have decent computational requirements, but these are not necessarily high-frequency. Winton Capital, for example, is one of the larger hedge funds in Europe by assets, but they have talked publicly about emphasizing longer-term horizons because even in liquid markets there simply is not enough liquidity to turn over the volume they would need to make an impact on their returns. In this case, whilst execution is always important, the research side of things is where the value gets created. And it's not unusual to have quant funds where every portfolio manager also programs. (I will not mention names.) One might think that rapid iteration here could have value.

http://www.efinancialcareers.co.uk/jobs-UK-London-Senior_Data_Scientist_-_Quant_Hedge_Fund.id00654869

FWIW, having spoken to a few people over the past few weeks, I am struck by how hollowed-out the front office has become, both within banks and hedge funds. It's a nice business when things go well, but there is tremendous operating leverage, and if one builds up fixed costs, then losing assets under management and having a poor period of performance (which is part of the game, not necessarily a sign of failure) can quickly mean that you cannot pay people anything beyond salaries - which hurts morale and means you risk losing your best people.

So people have responded by paring down quant/research support for revenue-producing roles, even when that makes no sense. (Programmers are not expensive.) In that environment, D may offer attractive productivity without sacrificing performance.

I agree with most of these points.

For some reason, people often associate quant finance / high-frequency trading with one of two things: either ultra-low-latency execution or option pricing, which is just wrong. In all likelihood, the execution is performed on co-located FPGA grids, so that part is out of the question; and options trading is just one of the many things hedge funds do.

What takes the most time and effort is the usual "data science" (which in many cases boils down to data munging): managing huge amounts of raw structured/unstructured high-frequency data; extracting the valuable information and learning strategies; implementing fast, efficient backtesting frameworks, simulators, etc. The need for "efficiency" here comes naturally from the fact that a typical task in the pipeline requires dozens to hundreds of GB of RAM and dozens of hours of runtime on a high-grade box (so no one would really care if the GC stops the world for 0.05 seconds).
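
As a toy illustration of the data-munging side, here is a sketch in D that streams a large tick file lazily rather than loading it into RAM - the file name ticks.csv and its column layout (symbol,timestamp,price,size) are invented for the example:

    import std.algorithm : splitter;
    import std.conv : to;
    import std.stdio : File, writefln;

    void main()
    {
        long[string] volume;                     // symbol -> cumulative traded size
        foreach (line; File("ticks.csv").byLine) // lazy: one line in memory at a time
        {
            auto f = line.splitter(',');
            auto symbol = f.front.idup; f.popFront();
            f.popFront();                        // timestamp (unused here)
            f.popFront();                        // price (unused here)
            volume[symbol] += f.front.to!long;   // accumulate traded size
        }
        foreach (symbol, v; volume)
            writefln("%s: %s", symbol, v);
    }

The byLine range reuses its buffer, so memory stays flat regardless of file size; only the symbol keys get copied onto the heap.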

In this light, as I see it, D's main advantage is a high "runtime-efficiency / time-to-deploy" ratio, whereas one of the main disadvantages for practitioners would be the lack of standard tools for working with structured multidimensional data and linear algebra - something like numpy or pandas.
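
For instance, a rolling mean - a one-liner in pandas - can be expressed with Phobos ranges, but you have to assemble it by hand. A minimal sketch (not a library API; the window width of 3 is arbitrary):

    import std.algorithm : map;
    import std.range : drop, zip;
    import std.stdio : writeln;

    void main()
    {
        auto prices = [101.0, 102.5, 101.8, 103.2, 104.0];

        // 3-period simple moving average: zip builds the sliding window.
        auto sma = zip(prices, prices.drop(1), prices.drop(2))
                   .map!(t => (t[0] + t[1] + t[2]) / 3.0);

        writeln(sma); // [101.767, 102.5, 103]
    }

Everything here is lazy, but the moment you want labeled axes, joins, or resampling, you are writing it yourself - which is exactly the gap a numpy/pandas equivalent would fill.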

Cheers.
