I've recently been really fascinated by the topic of complexity and what keeps us from keeping software simple. The wider internet likes to blame "lazy programmers" and "evil managers" for this, as if any software could, with sufficient time, be made as simple as "hello world". I've instead been looking at how various factors create complexity "pressure": code that needs to satisfy a physical constraint is more likely to be complex than code that doesn't, and so on.
One complexity pressure is "impedance": when the problem you are solving isn't well suited to the means you have to solve it. For example, if you need to write really fast software, then Python will be too slow. You can get around this by using a foreign function interface, as scientific libraries do, or by running multiple processes, as webdevs do, but these are workarounds you might not need if you were using a faster language in the first place. In a sense, impedance is complexity that comes from using "the wrong tool for the job."
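As a minimal sketch of the multiple-processes workaround: Python's GIL keeps one interpreter from using more than one core for CPU-bound work, so you split the work across processes instead. The workload here (summing squares over a range) is a hypothetical stand-in for any CPU-heavy task; the function names are mine, not from any particular library.

```python
from multiprocessing import Pool

def sum_squares(bounds):
    # Worker function: must be importable at module top level
    # so it can be pickled and sent to child processes.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_squares(n, workers=4):
    # Carve [0, n) into one chunk per worker process.
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs the remainder
    with Pool(workers) as pool:
        return sum(pool.map(sum_squares, chunks))

if __name__ == "__main__":
    print(parallel_sum_squares(1_000_000))
```

Note the extra machinery a single-threaded C or Rust program wouldn't need: a picklable top-level worker, manual chunking, a process pool, and the `__main__` guard. That overhead is the impedance showing through.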
Saying that Python is the "wrong tool" for data science is a little inflammatory. It might have impedance flaws, but it also has a lot going for it: rapid prototyping, a huge community, a large ecosystem, etc. Surely those matter more than the added complexity of slowness!
More broadly, "use the right tool for the job" directly contradicts the best practice of "choose boring technology":