Torch Wiki
I plan to use Torch as my main framework. This page collects, wiki-style, the things I look up most often so I can find them quickly when needed.
Official Homepage
- Installation is easy; just follow the getting-started instructions on torch.ch.
Tutorials
- Very useful; refer to them as needed.
Useful Libraries
- torchnet is a framework for torch which provides a set of abstractions aiming at encouraging code re-use as well as encouraging modular programming.
At the moment, torchnet provides four sets of important classes:
  - Dataset: handling and pre-processing data in various ways;
  - Engine: training/testing machine learning algorithms;
  - Meter: measuring performance or any other quantity;
  - Log: outputting performance or any other string to file/disk in a consistent manner.
For an overview of the torchnet framework, please also refer to this paper.
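As a rough sketch of how the Engine and Meter classes fit together (based on the SGD example in the torchnet README; `net`, `criterion` and `iterator` are assumed to be defined elsewhere):

```lua
local tnt = require 'torchnet'

local engine = tnt.SGDEngine()          -- Engine: runs the training loop
local meter  = tnt.AverageValueMeter()  -- Meter: tracks the average loss

engine.hooks.onStartEpoch = function(state)
   meter:reset()
end
engine.hooks.onForwardCriterion = function(state)
   meter:add(state.criterion.output)
end

-- net, criterion and iterator (a tnt.DatasetIterator over a Dataset)
-- are assumed to exist; this only shows the wiring, not a full script
engine:train{
   network   = net,
   criterion = criterion,
   iterator  = iterator,
   lr        = 0.1,
   maxepoch  = 5,
}
```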
- Includes a debugger (very useful).
DeepMind: Cephes Mathematical Functions Library
- Provides and wraps the mathematical functions from the Cephes mathematical library, developed by Stephen L. Moshier. This C library provides a lot of mathematical functions. It is used, among many other places, at the heart of SciPy.
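A tiny usage sketch, assuming the Lua wrapper exposes functions under their original Cephes names (check the package docs for the exact list):

```lua
local cephes = require 'cephes'

-- assumed Cephes-style names; gamma(4) = 3! = 6, ndtr(0) = 0.5 (standard normal CDF)
print(cephes.gamma(4))
print(cephes.ndtr(0))
```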
- Computes gradients automatically.
- Similar to the automatic differentiation in TensorFlow and Theano.
- Useful for designing very unusual architectures.
- ref: Bay Area DL summer school
grad = require 'autograd'
grad.optimize(true) -- global
local df = grad(f, { optimize = true }) -- for this function only
local grads = df(params)
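For context, a minimal end-to-end sketch (the toy loss `f`, parameter table and random data below are illustrative only; the `df(params, ...)` call returning the gradients and the loss follows the torch-autograd README):

```lua
local grad = require 'autograd'

-- toy squared-error loss for a 1-layer linear model (illustrative names/data)
local function f(params, x, y)
   local yhat = params.W * x + params.b
   local err  = yhat - y
   return torch.sum(torch.cmul(err, err))
end

local params = { W = torch.randn(1, 3), b = torch.randn(1) }
local x, y   = torch.randn(3), torch.randn(1)

local df = grad(f)                   -- differentiate w.r.t. the first argument
local grads, loss = df(params, x, y) -- grads.W and grads.b mirror params
print(loss, grads.W, grads.b)
```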
OptNet - reducing memory usage in torch neural networks
- Reduces memory usage by analyzing the network and sharing/reusing intermediate buffers.
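A minimal sketch of how it is typically applied (the `optimizeMemory` call and options follow the optnet README; the toy network and input sizes are placeholders):

```lua
local nn     = require 'nn'
local optnet = require 'optnet'

-- placeholder network and a sample input with the shapes you will actually use
local net = nn.Sequential()
   :add(nn.SpatialConvolution(3, 16, 3, 3, 1, 1, 1, 1))
   :add(nn.ReLU(true))
   :add(nn.SpatialConvolution(16, 16, 3, 3, 1, 1, 1, 1))
local input = torch.rand(1, 3, 64, 64)

-- rewrites the network in place so intermediate buffers are shared
optnet.optimizeMemory(net, input, {inplace = true, mode = 'inference'})
```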
- Web browser based.
- Useful over SSH.
- Practically the only GUI option for Torch.
- Socket based; listens on a given port.
- Developed from gfx.js, a browser-based graphics server.
- Deprecated; the terminal can be used instead.
From Torch.ch
Neural Network Package
This package provides an easy and modular way to build and train simple or complex neural networks using Torch:
- Modules are the bricks used to build neural networks. Each is itself a neural network, but modules can be combined with other networks using containers to create complex neural networks:
  - [Module](https://github.com/torch/nn/blob/master/doc/module.md#nn.Module): abstract class inherited by all modules;
  - Containers: container classes like `Sequential`, `Parallel` and `Concat`;
  - Transfer functions: non-linear functions like `Tanh` and `Sigmoid`;
  - Simple layers: like `Linear`, `Mean`, `Max` and `Reshape`;
  - Table layers: layers for manipulating `table`s, like `SplitTable`, `ConcatTable` and `JoinTable`;
  - Convolution layers: `Temporal`, `Spatial` and `Volumetric` convolutions.
- Criterions compute a gradient according to a given loss function, given an input and a target:
  - Criterions: a list of all criterions, including `Criterion`, the abstract class;
  - `MSECriterion`: the Mean Squared Error criterion, used for regression;
  - `ClassNLLCriterion`: the Negative Log Likelihood criterion, used for classification.
- Additional documentation:
  - Overview of the package essentials, including modules, containers and training;
  - Training: how to train a neural network using `StochasticGradient`;
  - Testing: how to test your modules;
  - Experimental Modules: a package containing experimental modules and criteria.
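Putting the pieces above together, a minimal sketch of building and training a small classifier (random data, illustrative only; the `Sequential`, `ClassNLLCriterion` and `StochasticGradient` usage follows the nn documentation):

```lua
require 'nn'

-- a small classifier built from a container, simple layers and transfer functions
local net = nn.Sequential()
net:add(nn.Linear(10, 25))
net:add(nn.Tanh())
net:add(nn.Linear(25, 2))
net:add(nn.LogSoftMax())

-- Negative Log Likelihood criterion for classification
local criterion = nn.ClassNLLCriterion()

-- StochasticGradient expects a dataset with a size() method and
-- dataset[i] = {input, target}; random data here, for illustration only
local dataset = {}
function dataset:size() return 100 end
for i = 1, dataset:size() do
   local input  = torch.randn(10)
   local target = (input:sum() > 0) and 1 or 2
   dataset[i] = {input, target}
end

local trainer = nn.StochasticGradient(net, criterion)
trainer.learningRate = 0.01
trainer.maxIteration = 5
trainer:train(dataset)

-- a single forward/backward pass, the pattern used inside every training loop
local output = net:forward(dataset[1][1])
local loss   = criterion:forward(output, dataset[1][2])
net:zeroGradParameters()
net:backward(dataset[1][1], criterion:backward(output, dataset[1][2]))
```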