This repository has been archived by the owner on Nov 1, 2021. It is now read-only.

Is it possible to segment the definition of the function to be taken the gradient? #118

Open
dlmacedo opened this issue May 9, 2016 · 7 comments

@dlmacedo
dlmacedo commented May 9, 2016

Dear friend,

Is it possible to define the function to be differentiated piecewise, i.e., to segment its definition?

For example:

if x < SomeValue then f(x) = someThing
if x > SomeValue then f(x) = someOtherThing

Thanks,

David

@alexbw
Collaborator

alexbw commented May 9, 2016

Conditionals are allowed in non-optimized mode (which is the default)
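
For readers landing here later, a hedged illustration of the distinction: a plain Lua `if` on a scalar works in direct (non-optimized) mode, but comparisons on *tensors* return element-wise mask tensors, not booleans, so piecewise tensor functions are usually written with masks. This is a minimal sketch only — `someValue` and the branch formulas are made up, and whether every call is differentiable under a given torch-autograd version is an assumption:

```lua
-- Hypothetical sketch: piecewise f(x) via element-wise masks rather than
-- a Lua `if`, since torch.lt/torch.ge on tensors return masks, not booleans.
local grad = require 'autograd'

local someValue = 1.0
local f = function(x)
  local below = torch.lt(x, someValue):typeAs(x)  -- 1 where x <  someValue
  local above = torch.ge(x, someValue):typeAs(x)  -- 1 where x >= someValue
  -- arbitrary example branches: 2x below the threshold, x^2 at or above it
  local y = torch.cmul(below, x * 2) + torch.cmul(above, torch.cmul(x, x))
  return torch.sum(y)
end

local df = grad(f)  -- gradient of the piecewise function
```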


@dlmacedo dlmacedo reopened this May 12, 2016
@dlmacedo
Author

dlmacedo commented May 12, 2016

Dear friends,

Could someone tell me what is wrong with the code below?

local SeLU = function(input)
  local output
  if torch.le(input, 0) then
    output = 0
  else
    output = input
  end 
  return output
end

I am using cudnn and getting this error:

/home/dlm/torch/install/bin/luajit: /home/dlm/torch/install/share/lua/5.1/nn/Container.lua:67: 0ms | Step: 0ms
In 3 module of nn.Sequential:
In 4 module of nn.Sequential:
/home/dlm/torch/install/share/lua/5.1/nn/Dropout.lua:20: bad argument #1 to 'resizeAs' (torch.CudaTensor expected, got userdata)
stack traceback:
[C]: in function 'resizeAs'
/home/dlm/torch/install/share/lua/5.1/nn/Dropout.lua:20: in function </home/dlm/torch/install/share/lua/5.1/nn/Dropout.lua:16>
[C]: in function 'xpcall'
/home/dlm/torch/install/share/lua/5.1/nn/Container.lua:63: in function 'rethrowErrors'
/home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:44: in function </home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:41>
[C]: in function 'xpcall'
/home/dlm/torch/install/share/lua/5.1/nn/Container.lua:63: in function 'rethrowErrors'
/home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:44: in function 'forward'
train.lua:151: in function 'opfunc'
/home/dlm/torch/install/share/lua/5.1/optim/sgd.lua:44: in function 'sgd'
train.lua:160: in function 'train'
train.lua:236: in main chunk
[C]: in function 'dofile'
.../dlm/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
[C]: at 0x00406670

When I use the following code, it works great:

local SeLU = function(input)
  local output = 0.5 * (torch.abs(input) + input)
  return output
end

@alexbw
Copy link
Collaborator

alexbw commented May 18, 2016

Is input always a torch tensor or scalar? You're returning a scalar, which might cause weirdness.

@alexbw
Copy link
Collaborator

alexbw commented May 23, 2016

Try torch.le(input, 0).value instead of torch.le(input, 0)

@dlmacedo
Author

I am using cudnn...

Input and output are tensors...

The following code is ok:

local seluFunc = function(input)
  local output, aux, aux2, aux3
  aux = (seluc/selua+1)/2
  aux2 = ((1-aux)*torch.abs( input+selub)+aux*( input+selub))-selub
  aux3 = -((1-aux)*torch.abs(-aux2+selub)+aux*(-aux2+selub))+selub
  output = selua*aux3
  return output
end

But I still don't understand what is wrong with the code below:

local seluFunc2 = function(input)
  local output, lessX, equalX, graterX, aux1, aux2, aux3
  lessX = torch.lt(input, -selub)
  aux1 = torch.gt(input, -selub)
  aux2 = torch.lt(input, selub)
  equalX = torch.cmul(aux1, aux2)
  graterX = torch.gt(input, selub)
  aux3 = torch.add(lessX,equalX)
  output = torch.add(aux3,graterX)
  return output
end

Full logs:

In 3 module of nn.Sequential:
In 54 module of nn.Sequential:
In 3 module of nn.Sequential:
...lm/torch/install/share/lua/5.1/nn/BatchNormalization.lua:77: attempt to index local 'input' (a nil value)
stack traceback:
...lm/torch/install/share/lua/5.1/nn/BatchNormalization.lua:77: in function 'checkInputDim'
...lm/torch/install/share/lua/5.1/nn/BatchNormalization.lua:130: in function <...lm/torch/install/share/lua/5.1/nn/BatchNormalization.lua:128>
[C]: in function 'xpcall'
/home/dlm/torch/install/share/lua/5.1/nn/Container.lua:63: in function 'rethrowErrors'
/home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:84: in function </home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:78>
[C]: in function 'xpcall'
/home/dlm/torch/install/share/lua/5.1/nn/Container.lua:63: in function 'rethrowErrors'
/home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:84: in function </home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:78>
[C]: in function 'xpcall'
/home/dlm/torch/install/share/lua/5.1/nn/Container.lua:63: in function 'rethrowErrors'
/home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:84: in function 'backward'
train.lua:152: in function 'opfunc'
/home/dlm/torch/install/share/lua/5.1/optim/sgd.lua:44: in function 'sgd'
train.lua:158: in function 'train'
train.lua:234: in main chunk
[C]: in function 'dofile'
.../dlm/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
[C]: at 0x00406670

WARNING: If you see a stack trace below, it doesn't point to the place where this error occured. Please use only the one above.
stack traceback:
[C]: in function 'error'
/home/dlm/torch/install/share/lua/5.1/nn/Container.lua:67: in function 'rethrowErrors'
/home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:84: in function 'backward'
train.lua:152: in function 'opfunc'
/home/dlm/torch/install/share/lua/5.1/optim/sgd.lua:44: in function 'sgd'
train.lua:158: in function 'train'
train.lua:234: in main chunk
[C]: in function 'dofile'
.../dlm/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
[C]: at 0x00406670

@alexbw
Collaborator

alexbw commented Jun 19, 2016

Does it work on plain NN, on the CPU?

@dlmacedo
Author

No.

When I try to run plain nn on the CPU with float tensors (no cuda or cunn), I get the following error:

/home/dlm/torch/install/bin/luajit: /home/dlm/torch/install/share/lua/5.1/nn/Container.lua:67: 0ms | Step: 0ms
In 3 module of nn.Sequential:
In 4 module of nn.Sequential:
/home/dlm/torch/install/share/lua/5.1/nn/Dropout.lua:20: bad argument #1 to 'resizeAs' (torch.FloatTensor expected, got torch.ByteTensor)
stack traceback:
[C]: in function 'resizeAs'
/home/dlm/torch/install/share/lua/5.1/nn/Dropout.lua:20: in function </home/dlm/torch/install/share/lua/5.1/nn/Dropout.lua:16>
[C]: in function 'xpcall'
/home/dlm/torch/install/share/lua/5.1/nn/Container.lua:63: in function 'rethrowErrors'
/home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:44: in function </home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:41>
[C]: in function 'xpcall'
/home/dlm/torch/install/share/lua/5.1/nn/Container.lua:63: in function 'rethrowErrors'
/home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:44: in function 'forward'
train.lua:149: in function 'opfunc'
/home/dlm/torch/install/share/lua/5.1/optim/sgd.lua:44: in function 'sgd'
train.lua:158: in function 'train'
train.lua:234: in main chunk
[C]: in function 'dofile'
.../dlm/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
[C]: at 0x00406670

WARNING: If you see a stack trace below, it doesn't point to the place where this error occured. Please use only the one above.
stack traceback:
[C]: in function 'error'
/home/dlm/torch/install/share/lua/5.1/nn/Container.lua:67: in function 'rethrowErrors'
/home/dlm/torch/install/share/lua/5.1/nn/Sequential.lua:44: in function 'forward'
train.lua:149: in function 'opfunc'
/home/dlm/torch/install/share/lua/5.1/optim/sgd.lua:44: in function 'sgd'
train.lua:158: in function 'train'
train.lua:234: in main chunk
[C]: in function 'dofile'
.../dlm/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
[C]: at 0x00406670
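
Editor's note, hedged: both stack traces point at the same underlying issue. Torch comparison ops (`torch.le`, `torch.lt`, `torch.gt`) return `ByteTensor` masks (surfacing as raw `userdata` on CUDA), and `seluFunc2` passes those masks straight to the next layer, so `Dropout`'s `resizeAs` rejects the type. A sketch of the likely fix — cast each mask back to the input's tensor type before mixing it into float arithmetic; `selub` is the author's own parameter, and `:typeAs` support on comparison results under autograd is an assumption:

```lua
-- Hedged sketch: convert ByteTensor comparison masks to the input's type.
local seluFunc2 = function(input)
  local lessX    = torch.lt(input, -selub):typeAs(input)
  local equalX   = torch.cmul(torch.gt(input, -selub):typeAs(input),
                              torch.lt(input,  selub):typeAs(input))
  local greaterX = torch.gt(input, selub):typeAs(input)
  -- the three masks partition the input, so their sum is 1 everywhere;
  -- a real SeLU would multiply each mask by its branch of the function
  return lessX + equalX + greaterX
end
```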
