Natural Language Processing

  • Rating: 4.4
Approx. 32 hours to complete

Course Summary

Learn the basics of natural language processing and explore how to use Python libraries for text analysis and manipulation.

Key Learning Points

  • Understand the fundamentals of natural language processing
  • Learn how to clean and preprocess text data
  • Practice using Python libraries for text analysis and manipulation
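The cleaning and preprocessing steps above can be sketched in plain Python. The course uses dedicated libraries (e.g. NLTK), but a minimal standard-library version, with a small illustrative stop-word list of our own choosing, might look like this:

```python
import re

# Tiny illustrative stop-word list; real pipelines use NLTK's or spaCy's.
STOP_WORDS = {"the", "a", "an", "is", "are", "and", "or", "to", "of"}

def preprocess(text: str) -> list[str]:
    """Lowercase, tokenize on word characters, and drop stop words."""
    text = text.lower()
    tokens = re.findall(r"[a-z0-9']+", text)  # strips punctuation as a side effect
    return [t for t in tokens if t not in STOP_WORDS]

print(preprocess("The cat IS on the mat, and the mat is warm!"))
# ['cat', 'on', 'mat', 'mat', 'warm']
```

Library tokenizers handle contractions, hyphenation, and Unicode far better than this regex, which is only meant to show the shape of the pipeline.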

Learning Outcomes

  • Understand the basics of natural language processing
  • Learn how to use Python libraries for text analysis and manipulation
  • Be able to clean and preprocess text data for analysis

Prerequisites and Recommended Background

  • Basic knowledge of Python programming
  • Understanding of basic statistics and machine learning concepts

Course Difficulty Level

Intermediate

Course Format

  • Online
  • Self-paced

Similar Courses

  • Applied Natural Language Processing
  • Text Mining and Analytics

Notable People in This Field

  • Jacob Perkins
  • Dan Jurafsky

Description

This course covers a wide range of Natural Language Processing tasks, from basic to advanced: sentiment analysis, summarization, and dialogue state tracking, to name a few. Upon completion, you will be able to recognize NLP tasks in your day-to-day work, propose approaches, and judge which techniques are likely to work well. The final project is devoted to one of the hottest topics in today's NLP: you will build your own conversational chatbot that assists with search on the StackOverflow website. The project builds on the course's practical assignments, which give you hands-on experience with tasks such as text classification, named entity recognition, and duplicate detection.
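To give a flavor of one of these tasks, duplicate detection can be approximated in its crudest form by comparing token overlap between two questions. This is only an illustration using Jaccard similarity; the course's assignments likely use richer representations such as embeddings:

```python
def jaccard_similarity(a: str, b: str) -> float:
    """Jaccard similarity between the token sets of two strings."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    if not sa and not sb:
        return 1.0  # two empty strings are trivially identical
    return len(sa & sb) / len(sa | sb)

q1 = "how to sort a list in python"
q2 = "in python how to sort a list"
q3 = "connect to postgres from java"
print(jaccard_similarity(q1, q2))  # 1.0 -- identical token sets, likely duplicates
print(jaccard_similarity(q1, q3))  # near 0 -- little overlap
```

Token overlap misses paraphrases ("remove an item" vs. "delete an element"), which is exactly why learned representations are worth the effort.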

Outline

  • Intro and text classification
  • About the University
  • About this course
  • Welcome video
  • Main approaches in NLP
  • Brief overview of the next weeks
  • [Optional] Linguistic knowledge in NLP
  • Text preprocessing
  • Feature extraction from text
  • Linear models for sentiment analysis
  • Hashing trick in spam filtering
  • Neural networks for words
  • Neural networks for characters
  • About University
  • Rules on the academic integrity in the course
  • Prerequisites check-list
  • Hardware for the course
  • Getting started with practical assignments
  • Classical text mining
  • Simple neural networks for text
  • Language modeling and sequence tagging
  • Count! N-gram language models
  • Perplexity: is our model surprised with a real text?
  • Smoothing: what if we see new n-grams?
  • Hidden Markov Models
  • Viterbi algorithm: what are the most probable tags?
  • MEMMs, CRFs and other sequential models for Named Entity Recognition
  • Neural Language Models
  • Whether you need to predict a next word or a label - LSTM is here to help!
  • Perplexity computation
  • Probabilities of tag sequences in HMMs
  • Language modeling
  • Sequence tagging with probabilistic models
  • Vector Space Models of Semantics
  • Distributional semantics: bee and honey vs. bee and bumblebee
  • Explicit and implicit matrix factorization
  • Word2vec and doc2vec (and how to evaluate them)
  • Word analogies without magic: king – man + woman != queen
  • Why words? From character to sentence embeddings
  • Topic modeling: a way to navigate through text collections
  • How to train PLSA?
  • The zoo of topic models
  • Word and sentence embeddings
  • Topic Models
  • Sequence to sequence tasks
  • Introduction to Machine Translation
  • Noisy channel: said in English, received in French
  • Word Alignment Models
  • Encoder-decoder architecture
  • Attention mechanism
  • How to deal with a vocabulary?
  • How to implement a conversational chat-bot?
  • Sequence to sequence learning: one-size fits all?
  • Get to the point! Summarization with pointer-generator networks
  • Introduction to machine translation
  • Encoder-decoder architectures
  • Summarization and simplification
  • Dialog systems
  • Task-oriented dialog systems
  • Intent classifier and slot tagger (NLU)
  • Adding context to NLU
  • Adding lexicon to NLU
  • State tracking in DM
  • Policy optimisation in DM
  • Final remarks
  • Instructions for Telegram bot hosted on AWS [Optional]
  • Papers mentioned in week 5
  • Keep up-to-date with NLP research
  • Task-oriented dialog systems
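Several outline items (n-gram language models, add-one smoothing, perplexity) fit in a few lines of code. A minimal sketch of a bigram model with add-one (Laplace) smoothing, on an invented toy corpus, might look like this:

```python
import math
from collections import Counter

def train_bigram(corpus: list[list[str]]):
    """Count contexts (unigrams) and bigrams over sentences padded with <s>, </s>."""
    uni, bi = Counter(), Counter()
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        uni.update(toks[:-1])            # each token acts as a context once
        bi.update(zip(toks, toks[1:]))   # adjacent pairs
    return uni, bi

def perplexity(sent, uni, bi, vocab_size):
    """Add-one smoothed bigram perplexity: lower means less surprised."""
    toks = ["<s>"] + sent + ["</s>"]
    log_prob = 0.0
    for w1, w2 in zip(toks, toks[1:]):
        p = (bi[(w1, w2)] + 1) / (uni[w1] + vocab_size)  # Laplace smoothing
        log_prob += math.log(p)
    n = len(toks) - 1                    # number of predicted tokens
    return math.exp(-log_prob / n)

corpus = [["the", "cat", "sleeps"], ["the", "dog", "sleeps"]]
uni, bi = train_bigram(corpus)
V = len({w for s in corpus for w in s}) + 2  # vocabulary plus <s> and </s>
print(perplexity(["the", "cat", "sleeps"], uni, bi, V))  # low: seen in training
print(perplexity(["dog", "cat", "the"], uni, bi, V))     # higher: unseen bigrams
```

A sentence the model has seen gets lower perplexity than a shuffled one, which is the "is our model surprised?" intuition from the outline in miniature.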

Summary of User Reviews

Pros from User Reviews

  • Well-structured lectures and assignments
  • Engaging and knowledgeable instructors
  • Practical exercises and projects that reinforce learning
  • Real-world applications and case studies
  • Access to a supportive and active online community

Cons from User Reviews

  • Some students found the course to be too technical or challenging
  • Limited opportunities for personalized feedback
  • Not enough emphasis on certain topics like machine learning
  • Some technical issues reported with the online platform
  • Course material may become outdated quickly due to the fast-paced nature of the field
Course Details

  • Language: English
  • Availability: Available now
  • Instructors: Anna Potapenko, Alexey Zobnin, Anna Kozlova, Sergey Yudin, Andrei Zimovnov
  • Institution: HSE University
  • Platform: Coursera

Instructor

Anna Potapenko

  • Rating: 4.4