From 158bb99d2b2c40414a40dfe97d6907dd11ea195c Mon Sep 17 00:00:00 2001
From: Haoqin Tu
Date: Tue, 27 Sep 2022 19:21:31 +0800
Subject: [PATCH] add PCAE to semi-supervised methods

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 099ca0b..3b0b40c 100644
--- a/README.md
+++ b/README.md
@@ -68,6 +68,7 @@ List format follows:
 1. **ICML (Monash)** / [Variational Autoencoder with Disentanglement Priors for Low-Resource Task-Specific Natural Language Generation](https://arxiv.org/abs/2202.13363) / **G2T**, BERT encoder for overall feature extraction and two different MLP encoder for label and content encoding severally. Used prefix-tuning and GPT-2 decoder for zero/few-shot style transfer generation. / Nan
 2. **Arxiv (Stanford)** / [Diffusion-LM Improves Controllable Text Generation](https://arxiv.org/abs/2205.14217) / **K2T**, syntactic control over continuous difussion language model in continuous word embedding space (as the latent space and optimized in VAE paradigm) with Plug and Play component. / [Code](https://github.com/XiangLi1999/Diffusion-LM)
 3. **ICML (UCLA)** / [Latent Diffusion Energy-Based Model for Interpretable Text Modeling](https://arxiv.org/pdf/2206.05895.pdf) / **G2T**, use diffusion process on latent space with prior sampling with EBM, variational bayes for latent posterior approximation. Similar paradigm of [S-VAE](https://arxiv.org/pdf/1406.5298.pdf) to deal with labels in semi-supervision. / [Code](https://github.com/yuPeiyu98/LDEBM)
+4. **KBS (Tsinghua)** / [PCAE: A Framework of Plug-in Conditional Auto-Encoder for Controllable Text Generation](https://www.sciencedirect.com/science/article/pii/S0950705122008942) / **G2T**, introduces a *Broadcasting Net* that repeatedly injects control signals into the latent space, yielding a concentrated and manipulable VAE latent space. Experimented on both RNN- and BART-based VAE models. / [Code](https://github.com/ImKeTT/pcae)
 
 ### 2021