Over the past five years there has been an explosion of research in the deep learning field. Companies like Google, Facebook, and OpenAI have created neural networks that are significantly more accurate and more versatile than their counterparts of a decade ago. However, biological applications of these techniques have lagged behind. In this talk we will explore how modern neural network architectures can be applied to a diverse set of biological problems. As a use case we will examine the HIV protein Tat, the transactivator of transcription. Using techniques from natural language processing and computer vision, we will predict its biological functions, its interacting partners, and potential future sequences. We will also discuss how to use techniques like transfer learning to take advantage of small datasets, as sketched below, and how to use our biological knowledge to avoid common pitfalls.
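To make the transfer-learning idea concrete, here is a minimal sketch in PyTorch: a small encoder stands in for a network pretrained on a large, generic sequence corpus, its weights are frozen, and only a new task-specific head is trained on the small labeled dataset. The class names, dimensions, and the commented-out weight file are illustrative assumptions, not the talk's actual code.

```python
import torch
import torch.nn as nn

# Hypothetical encoder standing in for a network pretrained on a large,
# generic protein-sequence corpus; in practice you would load saved weights.
class SequenceEncoder(nn.Module):
    def __init__(self, vocab_size=21, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)

    def forward(self, tokens):
        emb = self.embed(tokens)          # (batch, length, embed_dim)
        _, hidden = self.rnn(emb)         # final hidden state
        return hidden.squeeze(0)          # (batch, hidden_dim)

encoder = SequenceEncoder()
# encoder.load_state_dict(torch.load("pretrained_encoder.pt"))  # hypothetical weights

# Freeze the pretrained encoder so the small dataset only trains the new head.
for param in encoder.parameters():
    param.requires_grad = False

head = nn.Linear(64, 2)                   # new task-specific classifier
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Tiny illustrative batch: integer-encoded sequences and binary labels.
sequences = torch.randint(0, 21, (8, 50))
labels = torch.randint(0, 2, (8,))

for _ in range(10):                       # a few passes; the head has few parameters
    logits = head(encoder(sequences))
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```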
Learning Objectives:
1. Recognize the components of an RCNN-based predictor (see the first sketch after this list)
2. Understand how to use transfer learning to take advantage of mixed data
3. Recognize the components of a GAN (see the second sketch after this list)
4. Understand how GANs can be used to generate sequences with particular properties
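For objective 1, here is a minimal sketch of what is meant by an RCNN-based (recurrent-convolutional) predictor: an embedding layer, 1-D convolutions that detect local sequence motifs, a recurrent layer that integrates them along the whole sequence, and a dense classifier. The layer sizes, the PyTorch framing, and the example sequence length are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Minimal recurrent-convolutional (RCNN) predictor: embedding -> 1-D
# convolutions (local motifs) -> bidirectional LSTM (sequence context)
# -> dense classifier.
class RCNNPredictor(nn.Module):
    def __init__(self, vocab_size=21, embed_dim=32, n_filters=64,
                 hidden_dim=64, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.conv = nn.Conv1d(embed_dim, n_filters, kernel_size=5, padding=2)
        self.rnn = nn.LSTM(n_filters, hidden_dim, batch_first=True,
                           bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, n_classes)

    def forward(self, tokens):
        x = self.embed(tokens)            # (batch, length, embed_dim)
        x = x.transpose(1, 2)             # conv expects (batch, channels, length)
        x = torch.relu(self.conv(x))
        x = x.transpose(1, 2)             # back to (batch, length, n_filters)
        _, (hidden, _) = self.rnn(x)      # final hidden states of both directions
        x = torch.cat([hidden[0], hidden[1]], dim=1)
        return self.classifier(x)

model = RCNNPredictor()
logits = model(torch.randint(0, 21, (4, 86)))   # e.g. 86-residue Tat variants
print(logits.shape)                             # torch.Size([4, 2])
```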
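For objectives 3 and 4, here is a minimal sketch of the two GAN components: a generator that maps random noise to a soft sequence representation and a discriminator that scores real versus generated examples, trained adversarially. This is one possible setup assumed for illustration; generating genuinely discrete sequences in practice needs additional machinery (for example a continuous relaxation, as crudely approximated here), and conditioning on desired properties would add an extra input to both networks.

```python
import torch
import torch.nn as nn

# Minimal GAN components: a generator mapping noise to a soft (probabilistic)
# sequence representation, and a discriminator scoring real vs. generated.
SEQ_LEN, VOCAB, NOISE_DIM = 50, 21, 16

generator = nn.Sequential(
    nn.Linear(NOISE_DIM, 128), nn.ReLU(),
    nn.Linear(128, SEQ_LEN * VOCAB),
)
discriminator = nn.Sequential(
    nn.Linear(SEQ_LEN * VOCAB, 128), nn.ReLU(),
    nn.Linear(128, 1),                  # logit: real vs. generated
)

g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

# Placeholder for real one-hot-encoded sequences with the desired property.
real = torch.rand(8, SEQ_LEN * VOCAB)

for _ in range(5):
    # Discriminator step: real sequences scored as 1, generated ones as 0.
    noise = torch.randn(8, NOISE_DIM)
    fake = torch.softmax(generator(noise).view(8, SEQ_LEN, VOCAB), dim=-1).view(8, -1)
    d_loss = (loss_fn(discriminator(real), torch.ones(8, 1)) +
              loss_fn(discriminator(fake.detach()), torch.zeros(8, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator step: try to make generated sequences be scored as real.
    g_loss = loss_fn(discriminator(fake), torch.ones(8, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```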