
Explaining the structure of two functions in Lua

neuronphysics

New Coder
Hi everyone,

I am trying to port a variational autoencoder model (where the prior of the VAE is a mixture of Gaussians) from Lua (Torch) to PyTorch as part of my own model, and I am having a hard time fully understanding the Lua script (I couldn't find a tutorial for the language). The following two functions are unclear to me.

First, starting from this line, which computes the log-likelihood:

Code:
require 'layers/GaussianLogLikelihood'
-- nngraph-style definition: the unary '-' wraps an nn module into a graph node
function Likelihood(K, D, M)
    local x_sample = - nn.Identity() -- [(MxN)xD]
    local mean = - nn.Identity()  -- {[(MxN)xD]}k
    local logVar = - nn.Identity() -- {[(MxN)xD]}k

    local llh_table = nn.ConcatTable()
    for k = 1, K do
        local x = - nn.Identity()
        local mean_k_in = - nn.Identity()
        local logVar_k_in = - nn.Identity()

        local mean_k = mean_k_in
                        - nn.SelectTable(k)

        local logVar_k = logVar_k_in
                        - nn.SelectTable(k)

        local llh = {x, mean_k, logVar_k}
                        - nn.GaussianLogLikelihood()

        local llh_module = nn.gModule({x, mean_k_in, logVar_k_in}, {llh})
        llh_table:add(llh_module)
    end

    local out = {x_sample, mean, logVar}
                            - llh_table -- {[MxN,1]}k
                            - nn.JoinTable(2) -- [MxN,K]  -- log unNorm P
                            - nn.SoftMax()

    return nn.gModule({x_sample, mean, logVar},{out})
end
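
To check my understanding while porting, here is my tentative PyTorch reading of this graph (the names likelihood and gaussian_log_likelihood are my own guesses, not the original API; the helper is sketched after the second snippet below, so please correct me if I got it wrong):

Code:
import torch
import torch.nn.functional as F

def likelihood(x_sample, means, logvars):
    # x_sample: [(MxN) x D]; means and logvars: lists of K tensors, each [(MxN) x D]
    llh_cols = []
    for k in range(len(means)):
        # one [(MxN) x 1] column per mixture component (one llh_module per k above)
        llh_cols.append(gaussian_log_likelihood(x_sample, means[k], logvars[k]))
    # nn.JoinTable(2): concatenate the K columns -> [(MxN) x K] of unnormalised log-likelihoods
    # nn.SoftMax(): normalise across the K components
    return F.softmax(torch.cat(llh_cols, dim=1), dim=1)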
The Likelihood function also requires the GaussianLogLikelihood module:

Code:
-- standard Torch class declaration (my assumption; the original file presumably has this line)
local GaussianLogLikelihood, parent = torch.class('nn.GaussianLogLikelihood', 'nn.Module')

function GaussianLogLikelihood:__init(name, display)
    parent.__init(self)
    self.gradInput = {}
end

function GaussianLogLikelihood:updateOutput(input)
    -- input[1] : x      [NxD]
    -- input[2] : mean   [NxD]
    -- input[3] : logVar [NxD]
    -- llh = -0.5 sum_d { (x_i - mu_i)^2/var_i } - 1/2 sum_d (logVar_i) - D/2 ln(2pi)  [N]
    local N = input[1]:size(1)
    local D = input[1]:size(2)

    self._x = self._x or torch.Tensor():typeAs(input[1]):resizeAs(input[1])
    self._x:copy(input[1])
    self._var = self._var or torch.Tensor():typeAs(input[3]):resizeAs(input[3])
    self._var:copy(input[3])
    self._var:exp()

    self.output = self.output or input[1].new()
    self.output:typeAs(input[1]):resize(N, 1):zero()

    self.output:copy( self._x:add(-1, input[2]):pow(2):cdiv(self._var):sum(2) )
    self.output:add( input[3]:sum(2) )
    self.output:add( D * torch.log(2*math.pi) )
    self.output:mul(-0.5)

    return self.output
end
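
And this is how I currently read updateOutput in PyTorch terms (again my own naming; a sketch of my understanding, not a drop-in replacement):

Code:
import math
import torch

def gaussian_log_likelihood(x, mean, logvar):
    # x, mean, logvar: [N x D] tensors
    # llh = -0.5 * ( sum_d (x - mu)^2/var + sum_d logvar + D * ln(2*pi) )  -> [N x 1]
    D = x.size(1)
    var = logvar.exp()
    sq = ((x - mean) ** 2 / var).sum(dim=1, keepdim=True)
    return -0.5 * (sq + logvar.sum(dim=1, keepdim=True) + D * math.log(2 * math.pi))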


I am still confused about the exact inputs and outputs of each function. I would appreciate it if someone could confirm or correct my reading above, or describe the two functions in pseudocode. Thanks in advance.
 
