On Thu, Aug 11, 2016 at 3:08 AM, Tomas Lycken <[email protected]>
wrote:
> Late to the party, but what’s wrong with writing FastArray{1,10,1,10} (or
> even FastArray{10,10} if you’re OK with implicit 1-based indexing)? It
> seems that those (valid) type arguments could convey just as much
> information as FastArray(1:10, 1:10), and you could then handle any
> special-casing on the size in the constructor. That way you’d only need one
> (generic) type. Or am I missing something important here?
>
The package supports many other cases as well. The lower and the upper index
of each dimension can each be either fixed in the type or left flexible;
depending on this, the optimal way to access array elements is chosen. Also,
the number of dimensions can vary, so that there is `FastArray(1:10)` as well
as `FastArray(1:10,1:10,1:10,1:10)`. Julia's parametric types are not generic
enough to express all of this.
One example might be a `(3,n)` array, where the first dimension is known to
have size `3`, and the second can be arbitrary. You write `typealias
array3n FastArray(3,:)`, and later create arrays of this type e.g. via
`array3n{Float64}(:,100)`. At run time, this leads to better (faster) code
when accessing array elements in my use case.
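
Spelled out, that pattern looks roughly like this (a sketch; the indexing line
is only for illustration):
```Julia
using FastArrays

# first dimension fixed at size 3, second dimension left flexible
typealias array3n FastArray(3, :)

a = array3n{Float64}(:, 100)   # allocate a 3×100 array
a[2, 50] = 1.0                 # indexed like any other array
```
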
-erik
> // T
>
> On Thursday, August 11, 2016 at 12:50:14 AM UTC+2, Erik Schnetter wrote:
>
> The upshot of the discussions seems to be "it won't work in 0.5 because
>> the experts say so, and there are no plans to change that". So I'm going to
>> accept that statement.
>>
>> I think I'll use the following work-around:
>> ```Julia
>> immutable Wrapper{Tag,Types}
>>     data::Types
>> end
>> ```
>> where I use `Tag` (a `Val{Symbol}`-style tag) to generate many distinct types,
>> and `Types` will be a tuple type. This allows me to "generate" any immutable
>> type. I won't be able to access fields via the `.fieldname` syntax, but that
>> is not important to me.
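>>
>> For instance (a hypothetical alias; "fields" are reached by position in the
>> tuple):
>> ```Julia
>> typealias Point2D Wrapper{Val{:Point2D}, Tuple{Float64,Float64}}
>> p = Point2D((1.0, 2.0))
>> x = p.data[1]   # first "field"
>> ```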
>>
>> -erik
>>
>>
>>
>> On Wed, Aug 10, 2016 at 6:09 PM, Tim Holy <[email protected]> wrote:
>>
>>> AFAICT, it remains possible to do dynamic type generation if you (1) print
>>> the code that would define the type to a file, and (2) `include` the file.
>>>
>>> function create_type_dynamically{T}(::Type{T})
>>>     type_name = string("MyType", T)
>>>     isdefined(Main, Symbol(type_name)) && return nothing
>>>     filename = joinpath(tempdir(), string(T))
>>>     open(filename, "w") do io
>>>         println(io, """
>>>         type $type_name
>>>             val::$T
>>>         end
>>>         """)
>>>     end
>>>     eval(include(filename))
>>>     nothing
>>> end
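>>>
>>> Hypothetical usage (the generated name follows the `string("MyType", T)`
>>> pattern above):
>>>
>>> create_type_dynamically(Int64)
>>> x = MyTypeInt64(42)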
>>>
>>> Is this somehow less evil than doing it in a generated function?
>>>
>>> Best,
>>> --Tim
>>>
>>> On Wednesday, August 10, 2016 9:49:23 PM CDT Jameson Nash wrote:
>>> > > Why is it impossible to generate a new type at run time? I surely can do
>>> > > this by calling `eval` at module scope.
>>> >
>>> > module scope is compile time != runtime
>>> >
>>> > > Or I could create a type via a macro.
>>> >
>>> > Again, compile time != runtime
>>> >
>>> > > Given this, I can also call `eval` in a function, if I ensure the
>>> > > function is called only once.
>>> >
>>> > > Note that I've been doing this in Julia 0.4 without any (apparent)
>>> > > problems.
>>> >
>>> > Sure, I'm just here to tell you why it won't work that way in v0.5
>>> >
>>> > > I'm not defining thousands of types in my code. I define one type, and
>>> > > use it all over the place. However, each time my code runs (for days!), it
>>> > > defines a different type, chosen by a set of user parameters. I'm also not
>>> > > adding constraints to type parameters -- the type parameters are just `Int`
>>> > > values.
>>> >
>>> > Right, the basic tradeoff required here is that you just need to provide a
>>> > convenient way for your user to declare the type at the toplevel that will
>>> > be used for the run. For example, you can just JIT the code for the whole
>>> > run at the beginning:
>>> >
>>> > function do_run()
>>> >     return @eval begin
>>> >         # ... lots of function definitions ...
>>> >         do_work()
>>> >     end
>>> > end
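>>> >
>>> > As a hypothetical, self-contained instance of that pattern (the user
>>> > parameter `n` becomes a constant for the whole run):
>>> >
>>> > function do_run(n::Int)
>>> >     return @eval begin
>>> >         # definitions specialized on this run's user parameter
>>> >         do_work() = sum(ones($n))   # stand-in for the real work
>>> >         do_work()
>>> >     end
>>> > end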
>>> >
>>> > > On Wed, Aug 10, 2016 at 5:14 PM Erik Schnetter <[email protected]> wrote:
>>> > > On Wed, Aug 10, 2016 at 1:45 PM, Jameson <[email protected]> wrote:
>>> > >> AFAIK, defining an arbitrary new type at runtime is impossible, sorry. In
>>> > >> v0.4 it was allowed, because we hoped that people understood not to try.
>>> > >> See also https://github.com/JuliaLang/julia/issues/16806. Note that it
>>> > >> is insufficient to "handle" the repeat calling via caching in a Dict or
>>> > >> similar such mechanism. It must always compute the exact final output from
>>> > >> the input values alone (e.g. it must truly be const pure).
>>> > >
>>> > > The generated function first calculates the name of the type, then checks
>>> > > (`isdefined`) if this type is defined, and if so, returns it. Otherwise it
>>> > > is defined and then returned. This corresponds to looking up the type via
>>> > > `eval(typename)` (a symbol). I assume this is as pure as it gets.
>>> > >
>>> > > Why is it impossible to generate a new type at run time? I surely can do
>>> > > this by calling `eval` at module scope. Or I could create a type via a
>>> > > macro. Given this, I can also call `eval` in a function, if I ensure the
>>> > > function is called only once. Note that I've been doing this in Julia 0.4
>>> > > without any (apparent) problems.
>>> > >
>>> > >> Being able to define types with arbitrary constraints in the type
>>> > >> parameters works OK for toy demos, but it's intentionally rather difficult
>>> > >> since it causes performance issues at scale. Operations on Array are likely
>>> > >> to be much faster (including the allocation) than on Tuple (due to the cost
>>> > >> of *not* allocating) unless that Tuple is very small.
>>> > >
>>> > > I'm not defining thousands of types in my code. I define one type, and use
>>> > > it all over the place. However, each time my code runs (for days!), it
>>> > > defines a different type, chosen by a set of user parameters. I'm also not
>>> > > adding constraints to type parameters -- the type parameters are just `Int`
>>> > > values.
>>> > >
>>> > > And yes, I am using a mutable `Vector{T}` as underlying storage, that's
>>> > > not the issue here. The speedup comes from knowing the size of the array
>>> > > ahead of time, which allows the compiler to optimize indexing expressions.
>>> > > I've benchmarked it, and examined the generated machine code. There's no
>>> > > doubt that generating a type is the "right thing" to do in this case.
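>>> > >
>>> > > For illustration (not the package's actual code): with the first extent
>>> > > known to be 3, an accessor like
>>> > >
>>> > > immutable Fixed3xN{T}
>>> > >     data::Vector{T}
>>> > > end
>>> > > Base.getindex(a::Fixed3xN, i::Int, j::Int) = a.data[i + 3*(j-1)]
>>> > >
>>> > > uses a literal stride that the compiler can fold into the address
>>> > > computation, instead of loading the run-time size on every access.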
>>> > >
>>> > > -erik
>>> > >
>>> > > On Wednesday, August 10, 2016 at 1:25:15 PM UTC-4, Erik Schnetter wrote:
>>> > >>> I want to create a type, and need more flexibility than Julia's `type`
>>> > >>> definitions offer (see <https://github.com/eschnett/FastArrays.jl>).
>>> > >>> Currently, I have a function that generates the type, and returns the
>>> > >>> type.
>>> > >>>
>>> > >>> I would like to make this a generated function (as it was in Julia 0.4).
>>> > >>> The advantage is that this leads to type stability: the generated type
>>> > >>> only depends on the types of the arguments passed to the function, and
>>> > >>> Julia would be able to infer the type.
>>> > >>>
>>> > >>> In practice, this looks like
>>> > >>>
>>> > >>> using FastArrays
>>> > >>> # A (10x10) fixed-size array
>>> > >>> typealias Arr2d_10x10 FastArray(1:10, 1:10)
>>> > >>> a2 = Arr2d_10x10{Float64}(:,:)
>>> > >>>
>>> > >>>
>>> > >>> In principle I'd like to write `FastArray{1:10, 1:10}` (with curly
>>> > >>> braces), but Julia doesn't offer sufficient flexibility for this. Hence
>>> > >>> I use a regular function.
>>> > >>>
>>> > >>> To generate the type in the function I need to call `eval`. (Yes, I'm
>>> > >>> aware that the function might be called multiple times, and I'm handling
>>> > >>> this.)
>>> > >>>
>>> > >>> Do you have a suggestion for a different solution?
>>> > >>>
>>> > >>> -erik
>>> > >>>
>>> > >>> On Wed, Aug 10, 2016 at 11:51 AM, Jameson <[email protected]> wrote:
>>> > >>>> It is tracking the dynamic scope of the code generator; it doesn't care
>>> > >>>> about what code you emit. The generator function must not cause any
>>> > >>>> side-effects and must be entirely computed from the types of the inputs
>>> > >>>> and not other global state. Over time, these conditions are likely to be
>>> > >>>> more accurately enforced, as needed to make various optimizations
>>> > >>>> reliable and/or correct.
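>>> > >>>>
>>> > >>>> For instance (a hypothetical minimal reproducer, not code from this
>>> > >>>> thread), a generator that calls `eval` as a side effect is exactly the
>>> > >>>> kind of thing that gets rejected:
>>> > >>>>
>>> > >>>> @generated function bad(x)
>>> > >>>>     eval(:(const SOMETHING = 1))   # side effect during code generation
>>> > >>>>     return :(x + SOMETHING)
>>> > >>>> end
>>> > >>>> bad(1)   # ERROR: eval cannot be used in a generated function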
>>> > >>>>
>>> > >>>>
>>> > >>>>
>>> > >>>> On Wednesday, August 10, 2016 at 10:48:31 AM UTC-4, Erik Schnetter wrote:
>>> > >>>>> I'm encountering the error "eval cannot be used in a generated
>>> > >>>>> function" in Julia 0.5 for code that is working in Julia 0.4. My question
>>> > >>>>> is -- what exactly is now disallowed? For example, if a generated function
>>> > >>>>> `f` calls another (non-generated) function `g`, can `g` then call `eval`?
>>> > >>>>> Does the word "in" here refer to the code that is generated by the
>>> > >>>>> generated function, or does it refer to the dynamic scope of the code
>>> > >>>>> generation stage of the generated function?
>>> > >>>>>
>>> > >>>>> To avoid the error I have to redesign my code, and I'd like to know
>>> > >>>>> ahead of time what to avoid. A Google search only turned up the C file
>>> > >>>>> within Julia that emits the respective error message, as well as the
>>> > >>>>> Travis build log for my package.
>>> > >>>>>
>>> > >>>>> -erik
>>> > >>>>>
>>> > >>>>> --
>>> > >>>>> Erik Schnetter <[email protected]>
>>> > >>>>> http://www.perimeterinstitute.ca/personal/eschnetter/
>>> > >>>
>>> > >>> --
>>> > >>> Erik Schnetter <[email protected]>
>>> > >>> http://www.perimeterinstitute.ca/personal/eschnetter/
>>> > >
>>> > > --
>>> > > Erik Schnetter <[email protected]>
>>> > > http://www.perimeterinstitute.ca/personal/eschnetter/
>>>
>>>
>>>
>>
>>
>> --
>> Erik Schnetter <[email protected]>
>> http://www.perimeterinstitute.ca/personal/eschnetter/
>>
>
>
--
Erik Schnetter <[email protected]>
http://www.perimeterinstitute.ca/personal/eschnetter/