
Showing posts from January 24, 2019

CTC forward-backward algorithm - why is it being used?

As per ftp://ftp.idsia.ch/pub/juergen/icml2006.pdf, the forward-backward algorithm is supposed to speed up finding the valid paths for the ground-truth label. For example, if the ground-truth label is DOG and we have 10 time steps, the possible label sequences could be _DD_OO_GG_, DD_OOO_GGG, DO__GGGGGG, etc., so one path could be "_DO" (index 0, 1 and 2 from the 3 example possibilities shown above). Assuming most tasks will have many more time steps (and hence a much higher number of paths), how does the forward-backward algorithm speed things up? Say the forward variable is calculating the probability of obtaining an "O" at time step T: does that mean it picks the best possible path for getting to "O" (recursively)? For example, if the 3 paths to "O" were "_DD", "DDD" and "blank D blank", with probabilities...
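To make the recursion concrete, here is a minimal NumPy sketch of the CTC forward (alpha) recursion described in that paper, under a few assumptions of my own: log-space per-timestep label probabilities of shape (T, num_labels), the blank at index 0, and a hypothetical helper name ctc_forward. The point it illustrates is that alpha(t, s) accumulates all paths that reach symbol s of the blank-augmented label at time t, so there are only T x (2|l| + 1) table cells to fill rather than one entry per path.

```python
import numpy as np

def ctc_forward(log_probs, labels, blank=0):
    """Log-probability of `labels` summed over all valid alignments (sketch).

    log_probs: (T, num_labels) array of per-timestep log-probabilities.
    labels:    target label indices, e.g. the ids for D, O, G.
    """
    def logsumexp(*xs):
        m = max(xs)
        if m == -np.inf:
            return -np.inf
        return m + np.log(sum(np.exp(x - m) for x in xs))

    T = log_probs.shape[0]
    # Blank-augmented target: _ D _ O _ G _
    ext = [blank]
    for l in labels:
        ext += [l, blank]
    S = len(ext)

    alpha = np.full((T, S), -np.inf)      # forward variables in log space
    alpha[0, 0] = log_probs[0, blank]     # start with a blank ...
    alpha[0, 1] = log_probs[0, ext[1]]    # ... or with the first real label

    for t in range(1, T):
        for s in range(S):
            # A path can stay on the same symbol or advance from the previous one.
            a = alpha[t - 1, s]
            if s > 0:
                a = logsumexp(a, alpha[t - 1, s - 1])
            # Skipping the in-between blank is allowed when the labels differ.
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                a = logsumexp(a, alpha[t - 1, s - 2])
            alpha[t, s] = a + log_probs[t, ext[s]]

    # Total label probability: paths may end on the last label or the trailing blank.
    return logsumexp(alpha[-1, -1], alpha[-1, -2])
```

Note that each cell is a sum over all paths reaching it; taking a max in its place would instead give the single best (Viterbi) alignment rather than the total probability of the labelling.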

Grand Duchy of Lithuania (Grandprinclando Litovio)

History of Russia: Eastern Slavs, Rus' people; Kievan Rus' 800–1100; Russian principalities 1100–1547; Grand Principality of Vladimir-Suzdal 1113–1236; Novgorod Republic 1136–1478; Grand Duchy of Moscow 1276–1547; Grand Duchy of Lithuania 13th century – 1795; Tsardom of Russia 1547–1721; Russian Empire 1721–1917; Russian Republic 1917; Russian Soviet Federative Socialist Republic 1917–1922; Russian State 1918–1920; Soviet Union 1917–1991; Russian Federation from 1991; Glossary. Grand Duchy of Lithuania. Original name: Lietuvos Didžioji Kunigaikštystė (in the Middle Ages Didi Kunigystė Lietuvos), Magnus Ducatus Lituaniae (see the text for more). 1236–1795. Geography: the Grand Duchy of Lithuania in the 17th century. Capital: Voruta (hypothetical; before 1279), Kernavė (after 1279 – before 1321), Trakai (c. 1321–1323), Vilnius (from...

Regression with [-1, 1] target range - Should we use a tanh activation in the last 1-unit dense layer?

Say in a regression problem the target range is [0,1] or [-1,1], and the last layer of the network is xx = tf.layers.dense(inputs=xx, units=1, name='Prediction'). Should we use a sigmoid or a tanh activation function after xx, respectively, to force the output of the 1-unit dense layer to lie in [0,1] or [-1,1]? Note that most of the regression examples I have seen do not add an activation function after xx; they leave the output unbounded and let it take whatever value it ends up taking. regression activation-function
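For reference, here is a minimal sketch of the two options using the TF1-style tf.layers API that the question quotes; the 128-dimensional placeholder xx and the name 'Prediction_linear' are hypothetical stand-ins, not from the question.

```python
import tensorflow as tf  # TF1-style API, matching tf.layers.dense from the question

# Hypothetical input tensor standing in for the features feeding the last layer.
xx = tf.placeholder(tf.float32, shape=[None, 128], name='features')

# Option A: pass the activation directly to the dense layer.
# tf.nn.tanh bounds the output to (-1, 1); use tf.sigmoid for (0, 1) targets.
pred_bounded = tf.layers.dense(inputs=xx, units=1,
                               activation=tf.nn.tanh,
                               name='Prediction')

# Option B: keep the 1-unit layer linear, as most regression examples do,
# and let the loss pull predictions into the target range during training.
pred_linear = tf.layers.dense(inputs=xx, units=1, name='Prediction_linear')
```

One common consideration is that tanh and sigmoid saturate near the ends of their ranges, which is part of why many regression examples keep the last layer linear rather than bounding it explicitly.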