Welcome!


News

  • [2017.11] One paper accepted by AAAI 2018. Project Page
  • [2017.10] Submitted a paper to ICLR 2018.
  • [2017.09] Submitted a paper to AAAI 2018.
  • [2017.09] Attended EMNLP 2017. Travel Report (draft)
  • [2017.06] One paper accepted by EMNLP 2017.
  • [2017.05] One paper accepted by KDD 2017.
  • [2017.04] Submitted a paper to EMNLP 2017.
  • [2017.02] Submitted a paper to KDD 2017.

Blogs

  • 14 Nov 2017


    Bugs & Pitfalls

    PyTorch LSTM: the LSTM takes [Seq_len * Batch_size * Hidden_size] as input, but the embedding layer usually outputs [Batch_size * Seq_len * Hidden_size]. The output of the LSTM is output, (h, c), where (h, c) is a tuple, so we should unpack it as output, _ = self.word_lstm(x) (a minimal sketch follows below). Read more!
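
    A minimal sketch of the shape fix and the unpacking described above. The tensor sizes and the embedding/word_lstm names here are illustrative, not from the post; passing batch_first=True to nn.LSTM would avoid the transpose.

      import torch
      import torch.nn as nn

      batch_size, seq_len, hidden_size = 4, 10, 32
      embedding = nn.Embedding(num_embeddings=100, embedding_dim=hidden_size)
      word_lstm = nn.LSTM(input_size=hidden_size, hidden_size=hidden_size)

      tokens = torch.randint(0, 100, (batch_size, seq_len))  # [Batch_size, Seq_len]
      x = embedding(tokens)   # [Batch_size, Seq_len, Hidden_size]
      x = x.transpose(0, 1)   # LSTM (batch_first=False) expects [Seq_len, Batch_size, Hidden_size]

      output, _ = word_lstm(x)  # the second return value is the (h, c) tuple
      print(output.shape)       # torch.Size([10, 4, 32])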

  • 16 Sep 2017


    Tools and Environment Setup

    I have found lots of tools, apps, and tricks quite useful. Most of them are really well known, but I still tried to list all the free ones I knew, just in case someone has overlooked one or two in the past. Autojump: a cd command that learns, extremely useful, everyone loves... Read more!

  • 05 Sep 2017


    EMNLP 2017 Report

    I will attend the EMNLP 2017 conference and present our Relation Extraction paper. I'm also planning to write an EMNLP 2017 report (still a draft). Now let's quickly go through some interesting papers of this EMNLP conference. Rotated Word Vector Representations and their Interpretability. Basic idea: rotate embedding vectors into a new... (a toy sketch follows below). Read more!
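
    As a hedged aside on that "rotate embedding vectors" idea: a rotation is just an orthogonal change of basis, so it can re-align the coordinate axes without changing the similarity structure of the embeddings. The toy numpy sketch below uses a random orthogonal matrix purely for illustration; it is not the rotation criterion the paper actually optimizes.

      import numpy as np

      rng = np.random.default_rng(0)
      E = rng.normal(size=(1000, 50))  # toy embedding matrix: 1000 words, 50 dims

      # A random orthogonal matrix via QR decomposition (stand-in for the
      # paper's actual rotation objective).
      Q, _ = np.linalg.qr(rng.normal(size=(50, 50)))
      E_rot = E @ Q  # rotated embeddings: new coordinates, same geometry

      # Pairwise dot products (and hence cosine similarities) are unchanged.
      print(np.allclose(E @ E.T, E_rot @ E_rot.T))  # True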

  • 05 Sep 2017


    Hello World!

    As ideas can only be generated from ideas, I finally set up this blog. Although it's almost my first time trying GitHub Pages, it's surprisingly easy to use. So I would encourage everyone to share their thoughts through this site. For me, to be honest, I'm kind of... Read more!