Limitations of Autoregressive Models and Their Alternatives


Transcript of Limitations of Autoregressive Models and Their Alternatives

Page 1: Limitations of Autoregressive Models and Their Alternatives

Efficiently Locally Normalized (ELN): an abstraction of Autoregressive Models (including ordinary RNNs/LSTMs/Transformers/…). They parametrize the probability of a string x as

p(x) = ∏_t p(x_t | x_{<t})

with a fixed-size parameter vector. Computing p(x_t | x_{<t}) takes O(poly(t)).
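The per-step factorization in the ELN definition can be sketched in a few lines of Python. This is my own toy illustration, not the paper's: a uniform conditional stands in for a learned RNN/Transformer conditional, and "$" is an assumed end-of-string symbol.

```python
def cond_prob(symbol, prefix):
    """Toy locally normalized conditional: uniform over the 3 symbols {a, b, $}.
    A real ELN model would compute this from the prefix with a neural network."""
    return 1.0 / 3.0

def string_prob(x):
    """p(x) = prod_t p(x_t | x_{<t}), including the final stop symbol "$"."""
    p = 1.0
    for t, symbol in enumerate(x + "$"):
        p *= cond_prob(symbol, x[:t])
    return p

print(string_prob("ab"))  # (1/3)^3, three locally normalized steps
```

Each factor is locally normalized (sums to 1 over the next symbol), which is exactly what makes sampling and normalization easy for this family.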


Commonly held beliefs:
“RNN language models are Turing-complete. So they can model any computable language!”
“RNNs can fit any finite language. If they do not fit, just add more parameters!”

This work: Not really! Even with unlimited compute/annotation during training, there is a distribution over strings that cannot be fit by any autoregressive model (e.g., RNN/Transformer), even if you allow longer strings to use larger models (with polynomial growth). But this language can be easily “fit” by a short hand-written Python program!

The sequence order problem: consider the distribution

p( x # y ),

where x is a problem encoding and y is a solution candidate, and where p(x#y) ≠ 0 iff y is a solution to x. The prefix probability p(x#) > 0 if and only if x has a solution.

In other words, autoregressive models that factor p(x#y) = p(x#) · p(y | x#) must have the capacity to decide whether x has a solution, to ensure the joint distribution is accurate. If x is hard enough (e.g. NP-hard), no autoregressive model can even get the support right, as long as it uses polytime/polysize computation (i.e. ELN/ELNCPs)!

The other sequence order does fine under autoregressive models (if x is in NP):

p( y # x ) = p(y#) · p(x | y#),

where y is the solution candidate and x the problem encoding. But we don't always get to decide the sequence order!
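The asymmetry can be made concrete with SUBSET-SUM. The string encoding below is my own invention for illustration (the poster argues abstractly): x encodes an instance as "numbers;target" and y is a 0/1 mask over the numbers. Checking a given (x, y) pair is polytime, but the prefix factor p(x#) implicitly decides whether any solution exists.

```python
from itertools import product

def is_solution(x, y):
    """Polytime check of the JOINT support: is mask y a solution to instance x?"""
    nums_str, target_str = x.split(";")
    nums = [int(n) for n in nums_str.split(",")]
    mask = [int(b) for b in y]
    return (len(mask) == len(nums)
            and sum(n for n, b in zip(nums, mask) if b) == int(target_str))

def prefix_has_solution(x):
    """What p(x#) > 0 requires: does ANY y solve x? NP-hard in general; here
    brute force over all 2^n masks, which no polytime/polysize model can afford."""
    n = len(x.split(";")[0].split(","))
    return any(is_solution(x, "".join(map(str, m)))
               for m in product([0, 1], repeat=n))

print(prefix_has_solution("3,5,2;7"))  # True: the subset {5, 2} sums to 7
print(prefix_has_solution("3,5,2;4"))  # False: no subset sums to 4
```

In the reverse order y#x, the model only ever has to verify, never to search.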

Efficiently Locally Normalizable with Compact Parameters (ELNCP): a generalization of ELNs. An ELNCP model has infinitely many parameter vectors. When |x| = n, an ELNCP model uses parameters θ_n to compute p(x) = ∏_t p(x_t | x_{<t}). They provide a conceptual upper bound on the just-train-a-slightly-larger-model paradigm for autoregressive models. ELNCP weighted languages can have support outside of P because of the precomputed parameters.

P: a decision problem class. It is the set of all languages that can be decided in polynomial time.

Efficiently Computable (EC): an abstraction of Energy-Based Models (EBMs). A normalizable efficiently computable weighted language defines p(x) ∝ p̃(x), where p̃(x) can be computed in O(poly(|x|)). Their support can be (and can only be) anything in P.
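A toy EC weighted language (my own example, not the paper's): the EBM just scores whole strings with an unnormalized p̃(x) that is polytime to evaluate. Its support here is the language of binary strings with equally many 0s and 1s, which is in P.

```python
def ptilde(x):
    """Unnormalized EBM score: polytime to evaluate, no local normalization,
    no prefix probabilities. Support = {x : x has as many 0s as 1s}."""
    in_support = x.count("0") == x.count("1")
    return 2.0 ** (-len(x)) if in_support else 0.0

# p(x) = ptilde(x) / Z with Z = sum of ptilde over all strings. Evaluating
# ptilde is cheap; computing Z (or sampling) is the hard part for EBMs.
print(ptilde("0101"), ptilde("001"))  # 0.0625 0.0
```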

Why is it bad that ELNCP models can't decide all languages in P? Because then they can't choose among continuations of a prompt. That is, there's no way to ensure that p(x#y) > 0 iff y is a valid continuation of prompt x, even if that property can be checked in polytime.

P/poly: P with the help of poly-sized advice strings that can come from an oracle. P/poly is therefore more powerful than P: it can even model undecidable problems due to the oracle access!

Efficiently Computable with Compact Parameters (ECCP): a generalization of ECs. Similar to ELNCPs, ECCPs are a conceptual upper bound on the just-train-a-slightly-larger-model paradigm of EBMs.

Fix #1: use EBMs

EBMs do not suffer from the sequence order problem because they don't even try to compute the possibly expensive factors p(x_t | x_{<t})!

Downside: it is not easy to sample from EBMs, and training them requires estimating the partition function.

Fix #2: marginalize

A lightly marginalized ELNCP model marginalizes over an ELNCP language (lightly so because it does not have too many latent variables). The sequence of latent and observed symbols can be sampled from the ELNCP model. Intuitively, they avoid the sequence order problem with latent variables:

p( x # y ) = p( y # ) · p( x | y # ),

where x is the problem encoding, y is the solution candidate, and the factorization treats y as a latent variable that is generated first.

They can have any language in NP/poly as support!

p( x ) = Σ_{y ∈ V*} p( y # ) · p( x | y # ),

where x is the problem encoding and y is the latent variable being marginalized out.

Downside: marginalization is required even at test time.
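The marginal above can be sketched with toy distributions (the numbers and the tiny latent set are mine, chosen only to make the sum finite): the observed string x is scored by summing over latent strings y generated before it.

```python
LATENTS = ["0", "1", "00", "01", "10", "11"]  # a small stand-in for V*

def p_latent(y):
    """p(y#): toy uniform prior over the latent strings."""
    return 1.0 / len(LATENTS)

def p_obs_given_latent(x, y):
    """p(x | y#): toy deterministic "decoder" that reverses the latent."""
    return 1.0 if x == y[::-1] else 0.0

def p_obs(x):
    """The marginal p(x) = sum_y p(y#) * p(x | y#); note the sum over
    latents is paid at test time, which is exactly the stated downside."""
    return sum(p_latent(y) * p_obs_given_latent(x, y) for y in LATENTS)

print(p_obs("01"))  # 1/6: only the latent y = "10" decodes to "01"
```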

Fix #3: memorize anything we need

We can model anything if we have a big, big database! Examples: kNN-LM, adaptive semi-parametric language models, …

Downside: need a vast database of observed or precomputed answers.
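A toy lookup model (my own sketch): probabilities come straight from a stored table rather than being computed, so any distribution at all can be represented, provided the table (which may be huge, or require precomputing hard answers) already exists.

```python
TABLE = {"yes": 0.5, "no": 0.3, "maybe": 0.2}  # hypothetical precomputed database

def p_lookup(x):
    """No computation beyond retrieval; strings missing from the table get 0."""
    return TABLE.get(x, 0.0)

print(p_lookup("yes"), p_lookup("perhaps"))  # 0.5 0.0
```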

NP-complete (NPC): a set of languages that are widely believed to be outside P/poly (and that therefore cannot be the support of ECCP languages).

Figure: P(V*), the space of unweighted languages. Each rectangular outline corresponds to a complexity class and encloses the languages whose decision problems fall into that class. Each shape (whose name is colored to match the shape outline) corresponds to a model family and encloses the languages that can be expressed as the support of some weighted language in that family.

Future work: Average-case analysis? Are there model families that have all the good stuff but none of the bad stuff?

Chu-Cheng Lin*♯, Aaron Jaech♭, Xin Li♯, Matt Gormley♮, Jason Eisner♯
♯Johns Hopkins University, ♭Facebook AI, ♮Carnegie Mellon University

Model family | Compact parameters? | Efficient scoring? | Efficient sampling and normalization? | Support can be …
ELN/ELNCP: Autoregressive models (§3.1) | ✓ | ✓ | ✓ | some but not all L ∈ P
EC/ECCP: Energy-based models (§4.1) | ✓ | ✓ | ✗ | all L ∈ P but no L ∈ NPC
Lightly marginalized ELNCP: Latent-variable autoregressive models (§4.2) | ✓ | ✗ | ✓ | all L ∈ NP
Lookup models (§4.3) | ✗ | ✓ | ✓ | anything

*[email protected]