Explaining the structure of two functions in Lua

neuronphysics

Hi everyone,

I am trying to port a variational autoencoder model (where the prior of the VAE is a mixture of Gaussians) from Lua (Torch) to PyTorch as part of my model, and I am having a hard time fully understanding the Lua script (I couldn't find a tutorial for this style of Torch code). The two functions below are vague to me.
The first one, starting from this line, computes the log-likelihood.

Code:
require 'layers/GaussianLogLikelihood'

-- Builds an nngraph module that evaluates the diagonal-Gaussian
-- log-likelihood of x_sample under each of the K mixture components,
-- then softmaxes over components.  Note the nngraph operator syntax:
-- `- nn.Identity()` wraps a module as a graph node, and `node - module`
-- feeds the node into the module.
function Likelihood(K, D, M)
    local x_sample = - nn.Identity() -- [(MxN)xD]
    local mean     = - nn.Identity() -- {[(MxN)xD]}k
    local logVar   = - nn.Identity() -- {[(MxN)xD]}k

    local llh_table = nn.ConcatTable()
    for k = 1, K do
        local x           = - nn.Identity()
        local mean_k_in   = - nn.Identity()
        local logVar_k_in = - nn.Identity()

        -- pick the k-th component's parameters out of the input tables
        local mean_k   = mean_k_in   - nn.SelectTable(k)
        local logVar_k = logVar_k_in - nn.SelectTable(k)

        local llh = {x, mean_k, logVar_k}
                    - nn.GaussianLogLikelihood()

        local llh_module = nn.gModule({x, mean_k_in, logVar_k_in}, {llh})
        llh_table:add(llh_module) -- without this add the ConcatTable stays empty
    end

    local out = {x_sample, mean, logVar}
                - llh_table        -- {[MxN,1]}k
                - nn.JoinTable(2)  -- [MxN,K]  -- log unNorm P
                - nn.SoftMax()

    return nn.gModule({x_sample, mean, logVar}, {out})
end
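For reference while porting, the whole graph boils down to a small amount of math. Here is a framework-agnostic NumPy sketch (function names are mine, not from the Torch code) of what `Likelihood` computes: per-component diagonal-Gaussian log-likelihoods, joined along dimension 2 and softmaxed over the K components.

```python
import numpy as np

def gaussian_loglik(x, mean, log_var):
    """Row-wise diagonal-Gaussian log-likelihood, shape [N]."""
    var = np.exp(log_var)
    D = x.shape[1]
    return (-0.5 * np.sum((x - mean) ** 2 / var, axis=1)
            - 0.5 * np.sum(log_var, axis=1)
            - 0.5 * D * np.log(2 * np.pi))

def likelihood(x, means, log_vars):
    """means / log_vars: lists of K arrays, each [N, D].
    Returns the softmax over component log-likelihoods, shape [N, K]."""
    llh = np.stack([gaussian_loglik(x, m, lv)
                    for m, lv in zip(means, log_vars)], axis=1)  # [N, K]
    llh = llh - llh.max(axis=1, keepdims=True)  # numerically stable softmax
    e = np.exp(llh)
    return e / e.sum(axis=1, keepdims=True)
```

The softmax at the end turns the K unnormalized log-likelihoods into (unweighted) mixture responsibilities, which matches the `-- log unNorm P` comment in the graph.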
The Likelihood function also requires the GaussianLogLikelihood layer:

Code:
function GaussianLogLikelihood:__init(name, display)
    parent.__init(self)
end

function GaussianLogLikelihood:updateOutput(input)
    -- input[1] : x      [NxD]
    -- input[2] : mean   [NxD]
    -- input[3] : logVar [NxD]
    -- llh = -0.5 sum_d { (x_i - mu_i)^2/var_i } - 1/2 sum_d (logVar_i) - D/2 ln(2pi)  -- [N]
    -- input is a table {x, mean, logVar}, so each tensor is indexed out first
    local x, mean, logVar = input[1], input[2], input[3]
    local N = x:size(1)
    local D = x:size(2)

    -- reusable scratch buffers, allocated once and overwritten on each call
    self._x = self._x or torch.Tensor():typeAs(x):resizeAs(x)
    self._x:copy(x)
    self._var = self._var or torch.Tensor():typeAs(x):resizeAs(x)
    self._var:copy(logVar)
    self._var:exp() -- var = exp(logVar)

    self.output = self.output or x.new()
    self.output:typeAs(x):resize(N, 1):zero()
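For the PyTorch side of the port, the forward pass described by that comment is a few lines; here is a sketch under the assumption that the layer just evaluates the formula above (with autograd there is no need to replicate the `_x`/`_var` buffer bookkeeping or write an `updateGradInput`):

```python
import math
import torch

def gaussian_log_likelihood(x, mean, log_var):
    """Row-wise diagonal-Gaussian log-likelihood, shape [N, 1].

    llh = -0.5 * sum_d (x_d - mu_d)^2 / var_d
          - 0.5 * sum_d logVar_d
          - D/2 * ln(2*pi)
    """
    D = x.size(1)
    var = log_var.exp()
    llh = (-0.5 * ((x - mean).pow(2) / var).sum(dim=1)
           - 0.5 * log_var.sum(dim=1)
           - 0.5 * D * math.log(2 * math.pi))
    return llh.unsqueeze(1)  # [N, 1], matching the Torch layer's output shape
```

K calls to this function followed by `torch.cat(..., dim=1)` and `torch.softmax(..., dim=1)` would then reproduce the `ConcatTable` / `JoinTable(2)` / `SoftMax` chain in `Likelihood`.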