Standard machine learning methods assume that: (1) training data are abundant and available at all times, (2) computational power is sufficient, and (3) memory is not constrained. These assumptions are often violated in real-life settings. For instance, use-case-specific data are often scarce, arrive sequentially, and are non-stationary. Novelty must be integrated continuously while preserving previously learned knowledge. In these cases, continual learning (CL) methods need to be deployed. This course focuses on continual learning for image classification and discusses: (1) the main continual learning challenges (catastrophic forgetting, distribution drift, the stability-plasticity trade-off, the role of memory, scalability), (2) the main families of methods proposed to address them, (3) the relation to related areas (transfer learning, few-shot learning, edge learning), (4) the deployment of CL in practice, with examples of applications, and (5) good practices for CL evaluation.
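To make the role of memory concrete, the sketch below shows a fixed-size replay buffer based on reservoir sampling, a common building block of rehearsal-based CL methods. This is an illustrative example, not material from the course; the class name `ReservoirReplayBuffer` and the toy stream are invented for demonstration.

```python
import random

class ReservoirReplayBuffer:
    """Fixed-size memory for rehearsal-based continual learning.

    Reservoir sampling keeps an approximately uniform sample of the
    stream seen so far under a hard memory budget (illustrative sketch).
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.seen = 0          # total examples observed in the stream
        self.data = []         # stored (example, label) pairs
        self.rng = random.Random(seed)

    def add(self, example, label):
        if len(self.data) < self.capacity:
            self.data.append((example, label))
        else:
            # Replace a stored item with probability capacity / (seen + 1).
            j = self.rng.randrange(self.seen + 1)
            if j < self.capacity:
                self.data[j] = (example, label)
        self.seen += 1

    def sample(self, k):
        # Mini-batch of past examples to mix with the current task's batch.
        return self.rng.sample(self.data, min(k, len(self.data)))

# Simulate a non-stationary stream: task 0 arrives first, then task 1.
buf = ReservoirReplayBuffer(capacity=10)
for task in range(2):
    for i in range(100):
        buf.add(f"img_{task}_{i}", task)
```

During training on a new task, gradients are typically computed on a mix of the current batch and `buf.sample(k)`, which counteracts catastrophic forgetting at the cost of the memory budget `capacity`.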