That's just the thing - I'm not using the third-party encog library. I wrote
*github.com/twashing/nn* completely from scratch.

I did have the idea that bid, ask, and volume might not be enough data to
make reliable predictions, so I'll also try adding 3-, 5-, and 7-tick running
averages as inputs (sketched just below). But in each training lifecycle, the
error climbs and then settles at around 0.87, and it stays there even over the
next 1000 ticks or so. Something in my weight-update calculations pushes the
error up to 0.87 (or some other level) and keeps it there. So first, i) I
wanted to make sure my math was correct (the textbook form I'm checking
against is sketched after the error progression below). Then ii) I wanted to
make sure I was implementing the algorithm correctly.
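
For concreteness, the running-average inputs I mean would be derived roughly
like this (just a sketch; the tick keys and function names are illustrative,
not necessarily what's in github.com/twashing/nn):

;; Assumes each tick is a map like {:bid 1.2345 :ask 1.2347 :volume 100}
;; (illustrative shape, not necessarily the repo's actual representation).

(defn mid-price [{:keys [bid ask]}]
  (/ (+ bid ask) 2.0))

(defn running-average
  "Simple moving average over sliding windows of n mid-prices."
  [n ticks]
  (->> ticks
       (map mid-price)
       (partition n 1)            ; windows of size n, stepping one tick
       (map #(/ (reduce + %) n))))

;; (running-average 3 ticks), (running-average 5 ticks) and
;; (running-average 7 ticks) would be appended to each input row, aligned on
;; the most recent tick in each window.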

Error progression:

   1. 0.39489344469762966   ;; this error is a result of the initial randomized weights
   2. 0.8491601535927018
   3. 0.8727499656138056
   4. 0.870064689195726
   5. ...
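
For reference, the textbook delta-rule update I'm trying to check my math
against looks like this (a sketch with made-up names like weights / inputs /
learning-rate, not the actual code in the repo):

;; Single sigmoid output unit, squared error E = 1/2 * (target - out)^2.
;; Gradient-descent step:
;;   w_i <- w_i + eta * (target - out) * out * (1 - out) * x_i

(defn sigmoid [x]
  (/ 1.0 (+ 1.0 (Math/exp (- x)))))

(defn activate [weights inputs]
  (sigmoid (reduce + (map * weights inputs))))

(defn update-weights
  [weights inputs target learning-rate]
  (let [out   (activate weights inputs)
        delta (* (- target out) out (- 1.0 out))]
    (mapv (fn [w x] (+ w (* learning-rate delta x)))
          weights inputs)))

(defn example-error [weights inputs target]
  (let [out (activate weights inputs)]
    (* 0.5 (- target out) (- target out))))

That is, each weight should move by learning-rate * delta * input, with delta
back-propagated through the hidden layers in the multi-layer case. That's the
form I want to confirm my updates match.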

With respect to using a single brokerage's market data vs. aggregate tick
data... aren't they one and the same? If we're both using the same exchange,
I'd assume I'm getting the same market data as any other brokerage (allowing
for insiders, market makers, etc.). I'd be interested in being educated
otherwise.

But in principle, this NN is just something that can be used to predict any
time series - plant growth, sunspots, etc. So that error progression should
still bother me, no?
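
In other words, the framing is the generic sliding-window one, and nothing
about it is market-specific (again just a sketch; the window size is
arbitrary):

;; Turn any numeric series into [input-window target] training pairs:
;; the inputs are the previous n observations, the target is the next one.

(defn windowed-examples [n series]
  (->> series
       (partition (inc n) 1)
       (map (fn [w] [(vec (butlast w)) (last w)]))))

;; (windowed-examples 3 [1 2 3 4 5])
;; => ([[1 2 3] 4] [[2 3 4] 5])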


Hmm

Tim Washington
Interruptsoftware.ca

"*Today a young man on acid realized that all matter is merely energy
condensed to a slow vibration, that we are all one consciousness
experiencing itself subjectively, there is no such thing as death, life is
only a dream, and we are the imagination of ourselves. Here's Tom with the
Weather.*" -- Bill Hicks



On Sun, Nov 4, 2012 at 6:49 PM, Dominic Cerisano <dceris...@gmail.com> wrote:

> Well darn it.
>
> Had a detailed response ready to go and found that GG does not save
> drafts.
> POS. Movin on.
>
> The upshot was that I tried predicting tick data (see attachment) with a
> model pretty much identical to the one given here, with little real success.
> It would only resolve if I removed all of the discontinuities in the data
> (which came to over 20% of the data).
> The resulting "fantasy" set failed to predict trends that were not in the
> set (15% success rate maximum).
>
> The conclusion I came to is that actual tick data (which is an aggregate of
> all trading) is not predictable with the given model.
> NNs (certainly backprop) just try to spline a curve through a training set
> of input/output exemplars.
> Tick data simply does not seem to provide such a curve. That is equivalent
> to stating the obvious: stock markets are unpredictable from moment to
> moment.
>
> However, one approach that occurred to me was, rather than using aggregate
> tick data, to use historical data from a single brokerage that is known to
> use automated trading.
>
> Rather than trying to learn a dog's breakfast of influences given by
> aggregate data, non-aggregated data coming directly from an automated
> trading algo should prove to be a likely subject for machine learning.
>
> However, if John Nash was correct then these systems would deliberately
> make bad trades in order to be less predictable. Can't do too much of that
> though :)
>
> There are plenty of examples in Java of backprop. It is well known, and
> since you are using a known library I doubt there are errors in it, other
> than sub-optimal coding.
>
> Historical FIX data (non-aggregated direct trading transactions) is
> generally not published anywhere. The model being explored here would be
> very useful if it were.
>
> Cheers!
>
> Dominic Cerisano
>
> On Sunday, August 5, 2012 2:27:35 PM UTC-4, frye wrote:
>>
>> Hey all,
>>
>> This post is a fork of a thread in the post "community interest in machine
>> learning" <https://groups.google.com/forum/?fromgroups#!topic/clojure/heBrnBuUGqs>.
>> Some of us were starting to take a deep dive into clojure-encog
>> <https://github.com/jimpil/clojure-encog>, and I thought it would be a good
>> idea to have a new thread for that.
>>
>> So I took a look at the way encog-java
>> <https://github.com/encog/encog-java-core> (what clojure-encog
>> <https://github.com/jimpil/clojure-encog> wraps) loads tick data into its
>> system. There is a YahooFinanceLoader
>> <https://github.com/encog/encog-java-core/blob/master/src/main/java/org/encog/ml/data/market/loader/YahooFinanceLoader.java>
>> that pulls CSV data from a URL, but it assumes that prices only have a
>> daily granularity. Now, the encog-java system seems to have the concept of
>> granularity going down to the second (see here
>> <https://github.com/encog/encog-java-core/blob/master/src/main/java/org/encog/util/time/TimeUnit.java>).
>> But all of its market loaders and lists of ticks seem to stop at a daily
>> time granularity. See the LoadedMarketData source
>> <https://github.com/encog/encog-java-core/blob/master/src/main/java/org/encog/ml/data/market/loader/LoadedMarketData.java>,
>> which uses a daily-biased MarketDataType
>> <https://github.com/encog/encog-java-core/blob/master/src/main/java/org/encog/ml/data/market/MarketDataType.java>.
>> Obviously, that's not enough if we want to calculate on a second or
>> sub-second interval. Ultimately, the YahooFinanceLoader will give us a list
>> of LoadedMarketData, which assumes daily price ticks.
>>
>> What I need to know is: can I give the encog neural net a list of tick
>> data that has second or sub-second intervals? Back over in clojure-encog,
>> the thing that normalizes input data, the make-data function
>> <https://github.com/jimpil/clojure-encog/blob/master/src/clojure_encog/training.clj#L41>,
>> only deals with doubles (not a list of tick-data entries). The make-trainer
>> and train functions
>> <https://github.com/jimpil/clojure-encog/blob/master/src/clojure_encog/training.clj#L137>
>> seem to iterate for the number of strategies that you've specified. But I
>> can't see in Backpropagation
>> <https://github.com/encog/encog-java-core/blob/master/src/main/java/org/encog/neural/networks/training/propagation/back/Backpropagation.java>
>> or its superclass
>> <https://github.com/encog/encog-java-core/blob/master/src/main/java/org/encog/neural/networks/training/propagation/Propagation.java>
>> where that tick data is actually processed (the init and iteration methods
>> seem to just set up a background process). So I'm left wondering: how can I
>> give the core encog neural net a list of tick data that has second or
>> sub-second granularity?
>>
>>
>> Hmmm
>>
>> Tim Washington
>> Interruptsoftware.ca
>> 416.843.9060
>>

