Facebook Open Sources Modules for Faster Deep Learning on Torch

by Abel Avram, Jan 20, 2015. Estimated reading time: 1 minute

Facebook has open sourced a number of modules for faster training of neural networks on Torch.

Not long after Nvidia released cuDNN, a CUDA-based library for deep neural networks, Facebook’s AI Research laboratory (FAIR) released for public use a set of Torch modules, collectively called fbcunn and described as “significantly faster than the default ones.” The modules mainly target convolutional nets, are optimized for GPUs, and are built on Nvidia’s cuFFT library. The package contains:

  • Spatial convolution modules using FFT to accelerate convolutions
  • Containers for parallelizing both the data and model training on multiple GPUs
  • Wrappers for FFT/IFFT 
  • A faster temporal convolutional layer (1.5x to 10x faster than cuDNN)
  • Lookup table for neural language models and word embeddings
  • Hierarchical SoftMax module for training with a very large number of classes

Facebook built these modules on ideas from the paper Fast Training of Convolutional Networks through FFTs, co-authored by Yann LeCun, director of FAIR. According to the release notes, fbcunn is up to 1.84x faster than cuDNN for small kernel sizes (3x3) and up to 23.5x faster for large kernel sizes (5x5).
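The speedup rests on the convolution theorem: a convolution in the spatial domain becomes a pointwise multiplication in the frequency domain, so the cost of the multiply step no longer grows with the kernel size, which is why larger kernels benefit most. The toy NumPy sketch below illustrates only this mathematical idea; it is not fbcunn's implementation, which runs on the GPU via cuFFT and Torch.

    import numpy as np
    from scipy.signal import convolve2d  # reference direct convolution

    # Single-channel toy example of the convolution theorem that
    # FFT-based convolution layers exploit.
    image = np.random.randn(32, 32)
    kernel = np.random.randn(5, 5)

    # Direct (spatial-domain) "full" convolution as a reference.
    direct = convolve2d(image, kernel, mode='full')

    # FFT-based convolution: zero-pad both operands to the output size,
    # transform, multiply pointwise, and transform back.
    out_shape = (image.shape[0] + kernel.shape[0] - 1,
                 image.shape[1] + kernel.shape[1] - 1)
    fft_image = np.fft.rfft2(image, out_shape)
    fft_kernel = np.fft.rfft2(kernel, out_shape)
    via_fft = np.fft.irfft2(fft_image * fft_kernel, out_shape)

    # The two results agree up to floating-point error.
    print(np.allclose(direct, via_fft))  # True

In a real network the FFTs of the inputs and kernels are computed in batches on the GPU, so the transform cost is amortized across many feature maps, which is where the reported gains over direct convolution come from.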

One of the first applications of Torch and fbcunn is faster image recognition, for example classifying the objects found in 1.2 million images from ImageNet.
