# staticfloat

http://www.reddit.com/user/staticfloat

staticfloat (56 karma)

To illustrate this a bit, if you were to write a method such as:

```
function mysum(data)
    accumulator = 0.0
    for x in data
        accumulator += x
    end
    return accumulator
end
```

The method is written generically enough that there is no mention of what `data` is; we are simply assuming that whatever is passed in to `mysum` will be iterable. When my program calls `mysum(data)`, if `data` is a `Vector{Float64}`, then a version of this method will be generated that does floating-point addition in the `+=` method. If I call it with `data` as a user-defined `LinkedList` that contains arbitrary-precision Rational numbers, then a different version of this method will be compiled to do the appropriate iteration and summation.
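As a quick sketch of this (repeating the `mysum` definition from above so the example is self-contained), calling the same generic method with two differently-typed containers triggers two separate specializations:

```julia
function mysum(data)
    accumulator = 0.0
    for x in data
        accumulator += x
    end
    return accumulator
end

floats    = [1.5, 2.5]       # Vector{Float64}
rationals = [1//2, 1//4]     # Vector{Rational{Int64}}

mysum(floats)     # 4.0, via the Vector{Float64} specialization
mysum(rationals)  # 0.75, via a separate Vector{Rational{Int64}} specialization
```

Note that because `accumulator` starts as `0.0`, even the rational version accumulates in floating point; that wart is exactly what the `eltype()` discussion below addresses.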

When this method is used by calling code, which underlying `mysum` specialization gets used is determined by dynamic dispatch; thanks to type inference, however, that choice can often be made statically. Example:

```
function do_work()
    data = randn(1024) .* 100   # This gives me a Vector{Float64}
    data = round.(Int64, data)  # This gives me a Vector{Int64}
    return mysum(data)          # Calls the Vector{Int64} specialization
end
```

When I call `do_work()`, the compiler is able to propagate the types through the computation and determine, without a doubt, which method to call: the `mysum(x::Vector{Int64})` specialization. If the compiler cannot figure out ahead of time which types are being passed to your method, it must dispatch at runtime, inspecting the concrete types of the arguments before looking up which method to call in the method table.

So you can see how there is no performance impact at all from leaving the types unspecified: either the compiler knows ahead of time what the arguments are and can jump directly to the appropriate method, or it doesn't know and must do a dynamic lookup. Annotating your method with types wouldn't help the compiler know what the types at the call site are; for that, you would instead add type assertions at the call site (which is occasionally helpful in tough-to-infer locations).
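One sketch of such a call-site type assertion (the `Dict`, the `"batch"` key, and the `batches` function here are all hypothetical, invented for illustration): pulling a value out of a `Dict{String,Any}` gives the compiler no type information, but asserting the concrete type at the call site lets inference proceed from there.

```julia
# A Dict with abstract (Any) values: the compiler cannot infer what comes out
settings = Dict{String,Any}("batch" => 64)

function batches(settings)
    n = settings["batch"]::Int64   # call-site assertion: inference now knows n isa Int64
    return n * 2                   # compiled as plain integer arithmetic, no dynamic dispatch
end

batches(settings)  # 128
```

Without the `::Int64` assertion, everything downstream of `n` would have to be dispatched dynamically; with it, only the assertion itself is checked at runtime.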

As an aside on the expressiveness of parameter type annotations, my `mysum()` method commits the sin of assuming that the accumulator variable should start its existence as a `Float64`. Realistically, it should start life as the zero of whatever element type is contained within `data`. If I were to restrict this to only work on `Vector` types, I could do this easily with parametric types:

```
function mysum(data::Vector{T}) where {T}
    accumulator = T(0)
    ...
end
```
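Filling in the elided loop with the same body as the original `mysum` above gives a runnable sketch of this parametric version:

```julia
function mysum(data::Vector{T}) where {T}
    accumulator = T(0)       # accumulator starts as the zero of the element type T
    for x in data
        accumulator += x
    end
    return accumulator
end

mysum(Int64[1, 2, 3])  # 6, accumulated as an Int64 with no Float64 conversion
```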

However, this would not work for things like the hypothetical user-defined `LinkedList` type. Instead, we can use the `eltype()` method, which returns the element type of any container object given to it (and which the user can themselves extend for their `LinkedList` type):

```
function mysum(data)
    accumulator = eltype(data)(0)
    ...
end
```
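To make the `LinkedList` scenario concrete, here is a minimal sketch of what such a user-defined type might look like (the `Node`/`LinkedList` definitions are hypothetical, not from any real package): the user extends `Base.eltype` and the iteration protocol, and the generic `mysum` works unchanged.

```julia
struct Node{T}
    value::T
    next::Union{Node{T},Nothing}
end

struct LinkedList{T}
    head::Union{Node{T},Nothing}
end

# Extend eltype() for our container, so generic code can ask for the element type
Base.eltype(::Type{LinkedList{T}}) where {T} = T
Base.eltype(::LinkedList{T}) where {T} = T

# Extend the iteration protocol so `for x in data` works
Base.iterate(l::LinkedList) = l.head === nothing ? nothing : (l.head.value, l.head.next)
Base.iterate(::LinkedList, node) = node === nothing ? nothing : (node.value, node.next)

# The generic mysum, exactly as sketched above but with the loop filled in
function mysum(data)
    accumulator = eltype(data)(0)
    for x in data
        accumulator += x
    end
    return accumulator
end

list = LinkedList{Rational{Int}}(Node(1//2, Node(1//4, nothing)))
mysum(list)  # 3//4, accumulated as a Rational{Int64} from start to finish
```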

The beauty of this is that `eltype()`, just like `mysum()` itself, reaps the benefits of type inference and will most likely be optimized out of the function entirely. An example implementation of `eltype()` is the simple one-liner:

```
eltype(x::Vector{T}) where {T} = T
```

This simply returns `Float64` for an input of type `Vector{Float64}`, and since the compiler can determine that `mysum()` itself was called with a `Vector{Float64}`, we have a very well-tailored summation function that is extensible to all sorts of types and does not have any performance issues despite its extensibility.
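Since `Base` already defines `eltype()` for its own containers, a few quick checks show the per-container-type resolution that inference folds away (the `Int64` results assume a 64-bit machine):

```julia
eltype([1.0, 2.0])       # Float64
eltype(Int64[1, 2, 3])   # Int64
eltype(1:10)             # Int64 — ranges work too, since Base defines eltype for them
```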

staticfloat (5 karma)

To add to @loladiro's sibling answer, I would also say that Julia lies within this strange "in-between" place, where we have both theoretical CS types and applications types collaborating, and the lines begin to blur as the two camps learn from each other. The contributors who have worked on the Julia compiler come from a very wide range of technical disciplines such as Aeronautical Engineering, Chemistry, Physics, Electrical Engineering, and yes, Computer Science. I think the most important thing in teams like this is to do your best to put your ego to death and be willing to consider alternative viewpoints if they will, in the end, result in a better language. As long as people can keep working together without too much friction, a wide variety of backgrounds can help to grow new and interesting ideas.

staticfloat (5 karma)

JuliaCon 2020 is going on right now, so there's lots of good content coming out that you can enjoy. For those of you who are new to Julia, I suggest checking out the workshop from a few days ago, Learn Julia via Epidemic Modeling by the excellent David Sanders.