Blog
The OCaml Planet
Articles and videos contributed by experts, companies, and passionate developers from the OCaml community. From in-depth technical articles and project highlights to community news and insights into open source projects, the OCaml Planet RSS feed aggregator has something for everyone.
Want your Blog Posts or Videos to Show Here?
To contribute a blog post, or add your RSS feed, check out the Contributing Guide on GitHub.
At Jane Street, we enjoy using OCaml for lots of different things, from FPGA designs to web development. When it comes to machine learning, Python is one of the most commonly used languages. Machine learning frameworks such as TensorFlow or PyTorch wrap highly efficient C++ and GPU implementations of tensor operations in easy-to-use Python APIs. These frameworks also provide automatic differentiation facilities, which are commonly used to train deep learning models. In this talk we see how we can leverage TensorFlow or PyTorch directly from OCaml, so that we can use our favorite programming language to build deep learning models and train them on GPUs. We will consider the reinforcement learning setting, where an agent is trained to play Atari video games such as Space Invaders or Breakout. Our agents will be based on the Deep Q-Learning approach introduced in 2014.
Laurent Mazare: Laurent first joined Jane Street as a developer in the London office back in 2013, working on trading systems. After a short stint at DeepMind in 2017/2018, he is now back at Jane Street as a researcher working on the equities desk in London. Laurent holds a PhD in theoretical computer science from Institut National Polytechnique de Grenoble.
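To give a flavor of what calling PyTorch from OCaml can look like, here is a minimal sketch assuming the ocaml-torch bindings (the `Torch` module and its `Tensor` API); the exact function names are based on that library and should be checked against its documentation:

```ocaml
(* A minimal sketch using the ocaml-torch bindings (assumed installed via
   `opam install torch`). It builds a small tensor, applies an elementwise
   operation, and computes a gradient via automatic differentiation. *)
open Torch

let () =
  (* A 2x2 tensor of floats, with gradient tracking enabled. *)
  let x =
    Tensor.of_float2 [| [| 1.; 2. |]; [| 3.; 4. |] |]
    |> Tensor.set_requires_grad ~r:true
  in
  (* y = sum (x * x); its gradient w.r.t. x should be 2x. *)
  let y = Tensor.(sum (x * x)) in
  Tensor.backward y;
  Tensor.print (Tensor.grad x)
```

The same `Tensor` operations run on GPU by moving tensors to a CUDA device, which is what makes training the Atari agents from the talk practical.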
Presented by Anil Madhavapeddy (@avsm) We keep being told ReasonML can compile to native and do interop with OCaml, and that there's a wealth of existing code and tools we can draw into our applications - but how do we get there? This video will ...
Most of the time, our relationship to programming languages is somewhat remote; we depend on the arcane details of the languages we use, but we don’t usually have much of a say in how those languages evolve. At Jane Street, we started out in that mode, as a mere user of the language. But over the last 15 years, we’ve moved to a more active stance, where today we have a team of compiler developers who actively contribute to OCaml, and where we’re more deeply involved in figuring out the future direction of the language. In this talk, we discuss that history, touching on how upstream changes impacted us along the way, how we came to start making changes ourselves, and what ongoing projects we’re excited about.
Presented by Yaron Minsky: Yaron Minsky joined Jane Street back in 2002, and claims the dubious honor of having convinced the firm to start using OCaml. He also spends way too much time teaching his kids how to program.
Speaker: Hongbo Zhang
Speaker: Charles Chamberlain
Speaker: Frédéric Bour
Speaker: David Allsopp
Did you know that Jane Street uses OCaml for, like, everything? Did you also know that Jane Street builds FPGA designs? A problem? Come and find out how we design and test our FPGAs. We'll have some fun (or terrible disasters) with some demos on the Arty A7 hobbyist FPGA board, with the design expressed using HardCaml, an OCaml library for creating hardware designs, and driven by an embedded software stack written in OCaml using ports of your favorite Jane Street libraries. I'll round off with some thoughts on the pros and cons of writing hardware in OCaml, and talk about some ideas we would like to explore to make the process more productive in the future.
Presented by: Andy Ray. Andy has been designing IP cores for nearly 20 years, mainly in the areas of networking and video coding. Frustration with standard RTL development processes led him to develop the HardCaml suite of hardware design tools in OCaml. Then one day, while down at the pub, he got an email from Jane Street wondering about some sort of collaboration, and the rest is history.
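As a taste of what expressing hardware in OCaml looks like, here is a minimal sketch assuming the open-source Hardcaml library (the `Hardcaml.Signal`, `Circuit`, and `Rtl` modules); the circuit below is a hypothetical example, not one from the talk, and the API details should be checked against the library's documentation:

```ocaml
(* A minimal Hardcaml sketch (assumed installed via `opam install hardcaml`):
   a 1-bit AND gate, built as a combinational circuit and printed as Verilog. *)
open Hardcaml
open Signal

let () =
  (* Declare two 1-bit input ports. *)
  let a = input "a" 1 in
  let b = input "b" 1 in
  (* The output is the bitwise AND of the inputs. *)
  let c = output "c" (a &: b) in
  (* Elaborate the circuit and emit RTL for synthesis or simulation. *)
  let circuit = Circuit.create_exn ~name:"and_gate" [ c ] in
  Rtl.print Verilog circuit
```

The appeal of this style is that circuits are ordinary OCaml values, so the full language (functions, functors, tests) is available for composing and verifying designs.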