A Method for Controlling the Regularization Value in Attention-Based Continual Learning (어텐션 기반의 지속학습에서 정규화값 제어 방법)

Paper Details
Topic-level impact of 'A Method for Controlling the Regularization Value in Attention-Based Continual Learning'
Impact summary
Topics
  • Catastrophic Forgetting
  • Continuous Learning
  • Knowledge Transfer
  • Variable Lambda
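To connect the topic keywords above, here is a minimal sketch of regularization-based continual learning with a "variable lambda": an EWC-style quadratic penalty (as in the "Overcoming catastrophic forgetting in neural networks" reference below) whose strength changes per task. All names here (`ewc_penalty`, `lambda_for_task`, the decay schedule) are illustrative assumptions, not the paper's actual method.

```python
def ewc_penalty(theta, theta_old, fisher, lam):
    """EWC-style penalty: anchor new weights to the previous task's optimum,
    weighting each parameter by its estimated (Fisher) importance."""
    return lam * sum(f * (t - o) ** 2 for t, o, f in zip(theta, theta_old, fisher))

def lambda_for_task(base_lam, task_idx, decay=0.9):
    """Hypothetical 'variable lambda' schedule: decay the regularization
    strength as more tasks are learned."""
    return base_lam * decay ** task_idx

# Toy example: two parameters, their old-task optima, and importance estimates.
theta     = [1.2, 0.0]
theta_old = [1.0, -0.5]
fisher    = [2.0, 0.5]

lam = lambda_for_task(10.0, task_idx=0)   # full strength on the first new task
penalty = ewc_penalty(theta, theta_old, fisher, lam)
```

In training, this penalty would be added to the new task's loss; making lambda variable trades off stability (retaining old tasks) against plasticity (fitting the new one).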
  • Papers on the same topic: 37
  • Total citations: 0
  • Average topic-level impact: 0.0%

References of 'A Method for Controlling the Regularization Value in Attention-Based Continual Learning'

  • Re-evaluating continual learning scenarios: A categorization and case for strong baselines
  • Paying more attention to attention: Improving the performance of convolutional neural networks via attention transfer
  • Overcoming catastrophic forgetting in neural networks
  • Lifelong learning with dynamically expandable networks
  • Learning without forgetting
    Z. Li [2017]
  • Knowledge transfer via distillation of activation boundaries formed by hidden neurons
    B. Heo [2019]
  • Distilling the knowledge in a neural network
    G. Hinton [2014]
  • Continual lifelong learning with neural networks: A review
  • Continual learning with deep generative replay
  • Continual learning through synaptic intelligence
    F. Zenke [2017]
  • Catastrophic interference is eliminated in pretrained networks
    K. McRae [1993]
  • Catastrophic forgetting in connectionist networks