Resources and useful tips on transfer learning in NLP


I have a small amount of labeled data for training and testing a DNN. The main purpose of my work is to train a model that can do binary classification of text. For this, I have around 3,000 labeled examples and 60,000 unlabeled examples available. The data consists of instructions (e.g., "open the door" [label 1], "give me a cup of water" [label 1], "give me money" [label 0]). I have heard that transferring knowledge from other models could help a lot in this setting. Can anyone point me to useful resources on transfer learning in the NLP domain?



I have already done a few experiments. I used GloVe as pretrained embeddings and then tested the model with my labeled data, but got only around 70% accuracy. I also tried embeddings trained on my own data (63k examples) before training the model, which got 75% accuracy on the test data. My model architecture is given below.
[model architecture diagram]
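
Roughly, the GloVe setup looks like the following minimal sketch (assuming Keras; the glove.6B.100d.txt path, vocabulary size, and the average-pooling head are illustrative stand-ins, not the exact architecture in the image):

```python
import numpy as np
from tensorflow.keras.layers import Input, Embedding, GlobalAveragePooling1D, Dense
from tensorflow.keras.models import Model
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

VOCAB, EMBED_DIM, MAX_LEN = 10000, 100, 20  # illustrative sizes

texts = ["open the door", "give me a cup of water", "give me money"]
labels = np.array([1, 1, 0])

tokenizer = Tokenizer(num_words=VOCAB)
tokenizer.fit_on_texts(texts)
X = pad_sequences(tokenizer.texts_to_sequences(texts), maxlen=MAX_LEN)

# Load word -> vector pairs from the GloVe file (path is a placeholder).
glove = {}
with open("glove.6B.100d.txt", encoding="utf-8") as f:
    for line in f:
        parts = line.split()
        glove[parts[0]] = np.asarray(parts[1:], dtype="float32")

# Copy pretrained vectors into the embedding matrix; out-of-vocabulary
# words keep the all-zero initialization.
embedding_matrix = np.zeros((VOCAB, EMBED_DIM))
for word, i in tokenizer.word_index.items():
    if i < VOCAB and word in glove:
        embedding_matrix[i] = glove[word]

inputs = Input(shape=(MAX_LEN,))
x = Embedding(VOCAB, EMBED_DIM, weights=[embedding_matrix],
              trainable=False)(inputs)       # frozen pretrained embeddings
x = GlobalAveragePooling1D()(x)
outputs = Dense(1, activation="sigmoid")(x)  # binary classification head

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, labels, epochs=3)
```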



Q1: A quick question: would it be considered transfer learning if I use GloVe embeddings in my model?



Any kind of help is welcome. Ideas for building a model without using transfer learning are also welcome.

      deep-learning nlp convnet word-embeddings transfer-learning

asked Aug 20 '18 at 5:12 by faysal

1 Answer

If you use pre-trained models on data that are distinct from the data they were originally trained on, it's transfer learning. Your two-class sentence corpus is distinct from the data that the GloVe embeddings were generated on, so this could be considered a form of transfer learning. This might be a helpful explainer for general ideas around pre-training (and why it's a worthy pursuit).



Recent work in the NLP transfer-learning space that I'm aware of is ULMFiT by Howard and Ruder of fast.ai; here's the paper if you prefer that. OpenAI also has recent work extending the Transformer model with an unsupervised pre-training, task-specific fine-tuning approach.
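
To make the ULMFiT recipe concrete (first fine-tune a language model on all 63k instructions, where labels aren't needed, then train the classifier on the 3k labeled ones), here is a rough sketch against the fastai v1 text API; the data path, CSV name, and hyperparameters are placeholders:

```python
from fastai.text import *  # fastai v1 API

path = Path('data')  # assumes a texts.csv with a label column and a text column

# 1) Fine-tune the pretrained AWD-LSTM language model on all 63k instructions
#    (labels are ignored here, so the 60k unlabeled examples can be used too).
data_lm = TextLMDataBunch.from_csv(path, 'texts.csv')
learn_lm = language_model_learner(data_lm, AWD_LSTM, drop_mult=0.5)
learn_lm.fit_one_cycle(1, 1e-2)
learn_lm.save_encoder('ft_enc')

# 2) Train the binary classifier on the 3k labeled examples, reusing the
#    fine-tuned encoder and the language model's vocabulary.
data_clas = TextClasDataBunch.from_csv(path, 'texts.csv',
                                       vocab=data_lm.train_ds.vocab, bs=32)
learn_clf = text_classifier_learner(data_clas, AWD_LSTM, drop_mult=0.5)
learn_clf.load_encoder('ft_enc')
learn_clf.fit_one_cycle(1, 1e-2)
```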



As for your task, I think it might be more helpful to explore research around sentence classification than to dig deeply into transfer learning. For your purposes, embeddings seem to be a means of getting a reasonable representation of your data, rather than a way to prove that Common Crawl (or some other dataset) extends to your corpus.
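
For a concrete starting point from that literature, here is a minimal sketch of a Kim (2014)-style convolutional sentence classifier, assuming Keras; the single filter size and layer widths are illustrative (the original paper uses several filter sizes in parallel):

```python
from tensorflow.keras.layers import (Input, Embedding, Conv1D,
                                     GlobalMaxPooling1D, Dropout, Dense)
from tensorflow.keras.models import Model

VOCAB, EMBED_DIM, MAX_LEN = 10000, 100, 20  # illustrative sizes

inputs = Input(shape=(MAX_LEN,))
x = Embedding(VOCAB, EMBED_DIM)(inputs)   # could be seeded with GloVe as above
x = Conv1D(128, 5, activation="relu")(x)  # 5-gram feature detectors
x = GlobalMaxPooling1D()(x)               # keep the strongest match per filter
x = Dropout(0.5)(x)
outputs = Dense(1, activation="sigmoid")(x)

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```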



Hope that helps, good luck!

answered Aug 20 '18 at 15:14 by tm1212

• Your comments are really helpful. Yes, I am aware of fast.ai. I was also thinking of doing sentence classification without transfer learning, and I am getting around 87–90% accuracy without TL. Do you think that is reasonable accuracy when training 673,606 parameters with around 3k labeled examples? – faysal, Aug 26 '18 at 20:33
• I don't know enough about the data, the difficulty of the task, or the final application of what you're working on to give a well-founded answer. – tm1212, Aug 27 '18 at 15:10