I pay my bills working as a "data scientist". Basically that means I do traditional data analytics, but I do it in wild-wild-west situations that aren't standard for a statistician. Maybe the data isn't in a form that traditional tools can read, and I need to write a custom parser in some low-level language. Or the dataset might be too large to fit on one computer. Stuff like that. And my programming language of choice for this work is (almost) always Python.
My webinar goes over a lot of specific information, but it reminded me of one overarching theme. For any task X, there is a tool that is better than Python. In some cases quite a bit better. But for any task X, Python is also good enough to do it. And that flexibility is the real win with Python.
The other cool thing that I talked a little bit about is how some of Python's numerical libraries work under the hood. Most of them are built on NumPy arrays, which are super efficient to store and fast to do arithmetic with. These arrays are the data structures used for machine learning, numerical computation, and the like. I'd love to give a webinar called "Python Shedding its Skin" about how the language itself is implemented, but that's a talk for another day!
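To make the "efficient to store, fast to do arithmetic with" point concrete, here's a small sketch of what that looks like in practice. Unlike a Python list, which holds pointers to individually boxed objects, a NumPy array packs raw values into one contiguous buffer, and arithmetic on it runs as a single vectorized loop in C rather than a Python-level loop:

```python
import numpy as np

# One million 64-bit integers in a single contiguous buffer.
arr = np.arange(1_000_000, dtype=np.int64)

# Each element occupies exactly 8 bytes, with no per-object overhead,
# so the whole array is a predictable 8 MB of data.
print(arr.itemsize)   # 8 bytes per element
print(arr.nbytes)     # 8_000_000 bytes total

# Vectorized arithmetic: one call that loops in compiled code,
# instead of calling __mul__ a million times from Python.
doubled = arr * 2
print(doubled[:3])
```

An equivalent Python list of a million ints would cost several times the memory (each int is a full object) and the loop to double it would run orders of magnitude slower, which is why the machine-learning and numerical libraries standardize on these arrays.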