Language #30: Julia

Published on:

I recently ported a mathematical model of Arabidopsis thaliana from MATLAB to Julia as part of my MSc project. I had some distant experience with MATLAB from when I worked in aerospace, but Julia was a new language to me and one that had been on my list to try for a while.

Julia has a certain cachet in numerical and scientific programming. It advertises itself as a fast and friendly programming language, aiming to solve what the language's developers identify as "the two language problem". The "two language problem" states that one may have a friendly but slow language or an unfriendly but fast language, and that never the twain shall meet. Personally I don't believe this problem is a genuine one, but for researchers focussed on the science it's an appealing proposition.

Julia is primarily a functional programming language ultimately implemented atop LLVM. It has been described as a "post-object-oriented language", which is a somewhat meaningless term but provides cover for the language documentation to recycle terms from object-oriented programming in arguably misleading ways. Julia describes itself as dynamically typed, although I think it would be more accurate to say it's statically typed but leans heavily on type inference to create the appearance of dynamic typing. There are some rough edges to the type system, especially around assigning default values to typed function arguments. Overall, the language is OK to work with once you learn that it's not great at detecting syntax errors, and pretty much all other errors tend to look like a missing function definition to the compiler. It's not as friendly as Python but lacks the magic punctuation syntax of C++.
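The default-value rough edge can be seen in a short, hypothetical sketch (the function name is my own, not from the project): a default value that contradicts the argument's type annotation is accepted silently at definition time and only fails when the defaulted method is actually called.

```julia
# The default value is not checked against the type annotation when
# the method is defined; Julia generates a zero-argument method that
# forwards the (mismatched) default to f(::Int).
f(x::Int = "hello") = x

f(3)   # fine: dispatches to f(::Int)
f()    # throws MethodError: no method matching f(::String)
```

The error only surfaces at the call site, and as noted above it surfaces as what looks like a missing method rather than as a complaint about the definition itself.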

Julia's performance is harder to quantify. When running everything from the REPL, the Julia model was about twice as fast as its MATLAB sibling. Run as a script from the command line, the MATLAB model was easily twice as fast as the Julia model. Julia is often described as being JIT'd, but I think it's more accurate to say that it uses last-minute compilation. Whenever a Julia process starts it compiles the standard library, and this imposes a significant performance hit. When running the model as a script, about two-thirds of the overall run time was spent on this last-minute compilation stage. There didn't appear to be any attempt to cache what had been compiled, so running a script always loses a significant chunk of time to this start-up compilation activity.
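The cost is easy to observe in miniature (a sketch of the general behaviour, not the actual model): time the first and second calls of a freshly defined function, and the first call absorbs the compilation work.

```julia
# The first call to my_sum with a Float64 vector triggers
# compilation of that method; the second call reuses the
# already-compiled native code and is much quicker.
my_sum(xs) = reduce(+, xs)

data = rand(10^6)
@time my_sum(data)   # dominated by compilation
@time my_sum(data)   # compiled-code time only
```

In a long-lived REPL session this cost is paid once and amortised away, which is consistent with the REPL runs above beating the script runs.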

Where Julia is at its weakest is the tooling. Currently the choice of development environments is VSCode or your favourite text editor plus a REPL. I found that the VSCode plugin for Julia didn't integrate well with the language's debugger. There seemed to be issues with the out-of-the-box configuration of the plugin, and even when those were fixed it could take hours to reach a breakpoint. In the end I abandoned the debugger; it just wasn't useful.

The Julia community seemed both insular and polarised. Whilst there were plenty of helpful exchanges in the language's Discourse forums, there seemed to be an equal number of users who couldn't tolerate the idea that there might be flaws in the language and would gaslight people asking for help.

Julia has some outstanding libraries for numerical computing, which makes it a strong candidate for this kind of application. I think its claim to be a general-purpose programming language is a bit of a reach; there are more capable, better-supported languages for general applications. The hyperbole and mis-description of the language's capabilities and functionality is a bit of a turn-off personally, but it's very Silicon Valley "reality distortion field". Arguably there are still a lot of rough edges for a language over a decade old, and there's a general feel of stagnation and missed opportunity.