The best intro book for data science methods in Python generally, including linear regression, is probably Data Science from Scratch by Joel Grus. It covers simple linear regression, multiple regression, and logistic regression, among other traditional methods, along with a brief tour of the theory. The one disadvantage is that you are literally doing everything from scratch: my understanding is that the book does not cover these methods using standard Python libraries such as scikit-learn and pandas.
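To give a flavor of what "from scratch" means here (this sketch is my own, not taken from the book), simple linear regression can be fit with nothing but the closed-form least-squares solution: slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).

```python
def mean(xs):
    return sum(xs) / len(xs)

def fit_simple_linear_regression(xs, ys):
    """Return (intercept, slope) minimizing squared error, no libraries needed."""
    x_bar, y_bar = mean(xs), mean(ys)
    # slope = sample covariance of (x, y) divided by sample variance of x
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    intercept = y_bar - slope * x_bar
    return intercept, slope

# Toy data that is roughly y = 2x
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.0, 8.1, 9.9]
intercept, slope = fit_simple_linear_regression(xs, ys)
```

Working through derivations like this is exactly the payoff of the book's approach, even if you would never ship it.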
Another recommendation I would make is Real-World Machine Learning. My recollection is that this one covers machine learning methods using standard packages rather than building them from scratch, though it isn't as theoretically driven as Grus's text.
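For contrast with the from-scratch approach, here is a hedged sketch (my own, not from either book) of the same kind of task done the standard-library way: pandas holds the data, scikit-learn fits the model.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Toy data generated so that y = 2*x1 + 0.5*x2 exactly
df = pd.DataFrame({
    "x1": [1.0, 2.0, 3.0, 4.0, 5.0],
    "x2": [0.0, 1.0, 0.0, 1.0, 0.0],
    "y":  [2.0, 4.5, 6.0, 8.5, 10.0],
})

model = LinearRegression()
model.fit(df[["x1", "x2"]], df["y"])  # multiple regression in two lines
```

The fitted coefficients land in `model.coef_` and the intercept in `model.intercept_`; the library handles the linear algebra you would otherwise write yourself.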
If you're looking for something more mathematical that treats linear regression as its own theory ("general linear models" is what these are called - not to be confused with generalized linear models), I would recommend a traditional intro-Ph.D.-level statistics text, such as Plane Answers to Complex Questions. I've gotten to know this text very well since starting the Master's program I'm in, but I'm also aware that Agresti released a similar text very recently, and Searle's original Linear Models text (a classic) has been updated with R and SAS code.
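For orientation on that terminology trap, the two models can be written side by side (standard textbook notation, not specific to any of these books):

```latex
% General linear model: normal errors, identity link
y = X\beta + \varepsilon, \qquad \varepsilon \sim N(0, \sigma^2 I)

% Generalized linear model: exponential-family response, link function g
g\bigl(\mathbb{E}[y \mid X]\bigr) = X\beta
```

Ordinary least squares is the first; logistic and Poisson regression are instances of the second, with logit and log links respectively.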
After going through this material on linear models - particularly Searle's text - you will be well-prepared to tackle Elements of Statistical Learning, a take on machine learning from a statistical perspective, a.k.a. "statistical learning." This text covers penalization methods, such as LASSO and Ridge regression.
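As a taste of what those penalization chapters are about (a sketch under my own toy setup, not an example from ESL): ridge adds an L2 penalty that shrinks all coefficients toward zero, while the lasso's L1 penalty can drive some coefficients to exactly zero, performing variable selection.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
beta = np.array([3.0, 0.0, 0.0, 1.5, 0.0])  # sparse true coefficients
y = X @ beta + 0.1 * rng.normal(size=100)

ridge = Ridge(alpha=1.0).fit(X, y)  # shrinks every coefficient a little
lasso = Lasso(alpha=0.1).fit(X, y)  # zeros out the irrelevant ones
```

Comparing `ridge.coef_` and `lasso.coef_` on data like this makes the qualitative difference between the two penalties immediately visible, which is roughly how ESL motivates them.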