I am trying to understand the difference between smooth and non-smooth optimization problems. I also found this question: what does a smooth curve mean? From it I understand that smoothness is a relative concept. My question is: in optimization we call the least-squares cost function a smooth problem, even though it belongs to $C^2$, i.e. it is (at least) twice continuously differentiable, while the L1 regularization term $\lambda \sum_{i=1}^n |\theta_i|$ is called non-smooth, even though it seems to belong to $C^1$, i.e. once differentiable (I am assuming $L1 \in C^1$, since $\partial L1 / \partial \theta_i = \pm\lambda$ away from zero). So what does it mean to call a function smooth from the perspective of optimization?
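To make my confusion concrete, here is a small numerical check (a sketch of my own, using plain NumPy; the helper name `one_sided_slopes` is mine) comparing the one-sided difference quotients of a least-squares term and of the L1 penalty at $\theta = 0$:

```python
import numpy as np

h = 1e-6

def one_sided_slopes(f, x):
    """Right-hand and left-hand difference quotients of f at x."""
    right = (f(x + h) - f(x)) / h
    left = (f(x) - f(x - h)) / h
    return right, left

# Smooth case: a 1-D least-squares term f(theta) = theta^2.
print(one_sided_slopes(lambda t: t**2, 0.0))  # ~(1e-06, -1e-06): both tend to 0

# L1 case: f(theta) = |theta|.
print(one_sided_slopes(np.abs, 0.0))          # (1.0, -1.0): the slopes disagree
```

The two one-sided slopes of $|\theta|$ disagree at $0$; I am not sure how this relates to the definition of smoothness used in optimization.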
My guess is that in optimization we use either the gradient or the Hessian to compute the descent direction, so any function that is twice differentiable is called smooth. Am I correct? I am also not sure why L1 is called non-smooth.
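To illustrate my guess, here is a minimal sketch (my own toy example in NumPy, not taken from any solver) of gradient descent on a smooth quadratic versus subgradient descent on $|\theta|$, where I use $\operatorname{sign}(\theta)$ in place of the gradient that does not exist at $0$:

```python
import numpy as np

# Gradient descent on the smooth quadratic f(theta) = 0.5 * theta^2.
theta = 3.0
for _ in range(100):
    theta -= 0.1 * theta           # exact gradient: f'(theta) = theta
print(theta)                       # converges towards the minimizer 0

# Subgradient descent on the non-smooth f(theta) = |theta|.
theta = 3.0
for _ in range(100):
    theta -= 0.1 * np.sign(theta)  # sign(theta) is a subgradient of |theta|
print(theta)                       # bounces around 0 instead of settling
```

If my guess is right, the fact that I had to replace the gradient with a sign is exactly the non-smoothness, but I would appreciate a precise statement.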