Torch vs Theano

Recently we took a look at Torch 7 and found its data ingestion facilities less than impressive. Torch’s biggest competitor seems to be Theano, a popular deep learning framework for Python.

It seems that these two have had a “who is faster” competition going for a few years now. It’s been documented in the following papers:

  1. J. Bergstra, O. Breuleux, F. Bastien, P. Lamblin, R. Pascanu, G. Desjardins, J. Turian, Y. Bengio. Theano: a CPU and GPU Math Expression Compiler. (PDF)

  2. R. Collobert, K. Kavukcuoglu, C. Farabet. Torch7: A Matlab-like Environment for Machine Learning. (PDF)

  3. F. Bastien, P. Lamblin, R. Pascanu, J. Bergstra, I. Goodfellow, A. Bergeron, N. Bouchard, D. Warde-Farley, Y. Bengio. Theano: new features and speed improvements. (arXiv)


A figure from the Torch7 paper [2]. Torch in red, Theano in green; higher is better.

And a quote from [3]:

Bergstra et al. (2010) showed that Theano was faster than many other tools available at the time, including Torch5. The following year, Collobert et al. (2011) showed that Torch7 was faster than Theano on the same benchmarks.

The results in the last paper are mixed, if you’re wondering.

The latest act in this friendly competition, which can be seen as one between Bengio’s and LeCun’s groups, appears to be about FFT-based convolutions, first available in Theano and recently open-sourced by Facebook for Torch.
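Why do FFT convolutions help? By the convolution theorem, convolution in the signal domain equals elementwise multiplication in the frequency domain, so for large enough filters it’s cheaper to transform, multiply and transform back. Here’s a minimal 1-D sketch in NumPy, just to illustrate the identity; the frameworks do batched 2-D versions of this on the GPU:

```python
import numpy as np

def fft_convolve(a, b):
    # Linear convolution of a and b has length len(a) + len(b) - 1;
    # rfft zero-pads both signals to that length before transforming.
    n = len(a) + len(b) - 1
    fa = np.fft.rfft(a, n)
    fb = np.fft.rfft(b, n)
    # Convolution theorem: multiply the spectra, then invert.
    return np.fft.irfft(fa * fb, n)

a = np.random.rand(1000)
b = np.random.rand(50)

# Matches NumPy's direct convolution up to floating-point error.
assert np.allclose(fft_convolve(a, b), np.convolve(a, b))
```

Direct convolution costs roughly O(nm); the FFT route costs O((n + m) log(n + m)), which is why the payoff shows up with larger filters and batches.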

As a side note, the press really jumped on this second event, with headlines about turbo-charging deep learning and the like. Probably it’s the allure of Facebook and deep learning in the same sentence.

Let’s look at the convnet benchmarks by Soumith Chintala. He is a Facebook/Torch guy, and yet Theano’s convolution layer is reported to be the fastest at the time of writing. We’re still waiting for those fbfft results.

Anyway, speed isn’t everything and there’s more to life than FFT convolutions. From a developer’s perspective, minor differences in speed matter less than other factors, like ease of use. Which leads us to what Soumith had to say about Torch, according to VentureBeat:

It’s like building some kind of electronic contraption or, like, a Lego set. You just can plug in and plug out all these blocks that have different dynamics and that have complex algorithms within them.

At the same time Torch is actually not extremely difficult to learn — unlike, say, the Theano library.

We’ve made it incredibly easy to use. We introduce someone to Torch, and they start churning out research really fast.

Well, you already know our opinion about the “incredibly easy” bit. Torch is not really a Matlab-like environment. Matlab, for all its shortcomings, is a very polished piece of software with exemplary documentation. Torch, on the other hand, is rather rough around the edges.
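As for the claim that Theano is hard: the difficulty mostly comes from its programming model. Instead of plugging modules together, you describe a symbolic expression graph and compile it into a function. A minimal sketch using the standard Theano API (the variable names are ours):

```python
import theano
import theano.tensor as T

# Declare symbolic inputs; nothing is computed yet.
x = T.dmatrix('x')
w = T.dmatrix('w')

# Build a symbolic graph for a logistic layer.
y = T.nnet.sigmoid(T.dot(x, w))

# Compilation is where Theano optimizes the graph and generates
# C (or CUDA) code; this indirection is much of the learning curve.
predict = theano.function(inputs=[x, w], outputs=y)
```

Powerful, but debugging a compiled graph is a rather different experience from stepping through Lua modules block by block.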

Besides the language gap, that roughness is one of the reasons you don’t see much Torch usage outside of Facebook and DeepMind. At the same time, libraries built on Theano have been springing up like mushrooms after rain (you might want to take a look at Sander Dieleman’s Lasagne and at Blocks). It is hard to beat the familiar and rich Python ecosystem.
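These libraries restore much of the Lego-like feel on top of Theano’s graphs. A sketch with Lasagne’s layer API (the shapes and unit counts here are ours, purely for illustration):

```python
import lasagne
from lasagne.layers import InputLayer, DenseLayer, get_output

# Stack layers much like Torch's nn containers.
net = InputLayer(shape=(None, 784))
net = DenseLayer(net, num_units=256,
                 nonlinearity=lasagne.nonlinearities.rectify)
net = DenseLayer(net, num_units=10,
                 nonlinearity=lasagne.nonlinearities.softmax)

# Under the hood this still builds a Theano expression.
output = get_output(net)
```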

See also: Theano tutorials.

P.S. What about Caffe?

Caffe is a fine and very popular piece of software. How does it compare with Torch and Theano? Here’s sieisteinmodel’s answer from Reddit:

Caffe has a pretty different target. More mass market, for people who want to use deep learning for applications. Torch and Theano are more tailored towards people who want to use it for research on DL itself.
