I think your question is excellently phrased. The answer for anything data science-y is "no." The bottleneck will be transferring the input data onto the quantum CPU.
For algorithms like HHL that have superclassical performance, a complex superposition encoding the data needs to be created first. This state is subsequently "consumed" by the algorithm. The no-cloning theorem forbids creating copies of the encoded state, and hence the encoding step needs to be repeated every time the algorithm is run.
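To make the encoding bottleneck concrete, here is a toy NumPy sketch of amplitude encoding (the state-preparation scheme HHL-style algorithms typically assume): a length-N classical vector becomes the amplitudes of a log2(N)-qubit state, but producing that state still requires touching every entry.

```python
import numpy as np

# Sketch of amplitude encoding: a length-N classical vector becomes the
# amplitudes of a log2(N)-qubit state. Merely normalizing already touches
# every entry, so the loading step is Omega(N) work -- and because of
# no-cloning, it has to be repeated for every run of the algorithm.
data = np.array([3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0])  # N = 8 entries

amplitudes = data / np.linalg.norm(data)       # O(N) just to normalize
n_qubits = int(np.log2(len(data)))             # yet only 3 qubits hold the state

print(n_qubits)                                # 3
print(np.sum(amplitudes ** 2))                 # ~1.0, a valid quantum state
```

The asymmetry is the point: the state lives in logarithmically many qubits, but preparing it is linear in the data size, so any exponential speedup downstream can be eaten by the loading step.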
As another example, consider Grover's search, which is sub-linear in the number of calls to an oracle function. If the oracle references a linear array of data, it needs to work on superpositions of array indices. In other words, the entire dataset needs to fit in "quantum" memory.
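A small statevector simulation illustrates both halves of the claim: roughly sqrt(N) oracle calls suffice, and the oracle acts on a superposition of all indices at once.

```python
import numpy as np

# Toy statevector simulation of Grover's search over N = 16 items.
# The oracle phase-flips the marked index -- note that it acts on a
# superposition of ALL indices simultaneously, which is why the whole
# dataset must be addressable in "quantum" memory on real hardware.
N = 16
marked = 11

state = np.full(N, 1 / np.sqrt(N))                   # uniform superposition
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # ~O(sqrt(N)) oracle calls

for _ in range(iterations):
    state[marked] *= -1                  # oracle: phase-flip the marked index
    state = 2 * state.mean() - state     # diffusion: inversion about the mean

print(iterations)                        # 3 oracle calls for N = 16
print(state[marked] ** 2)                # ~0.96 probability of measuring `marked`
```

Three oracle calls instead of an expected eight classical lookups; but the uniform superposition in the first line already presumes the full array is oracle-accessible.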
Using a quantum CPU can only be sensible for computationally difficult problems whose hard instances can be specified by a relatively small number of bits.
I don't follow this area closely, so I might be totally off base, but I think a lot of the hopes for that sort of thing were based on the HHL algorithm. Then Tang showed that classical computers can be just as fast on that problem, so it's now somewhat up in the air how applicable quantum computers are.
Reading "Ewin Tang is a female" instead of "Ewin Tang is a woman" is what made me look into it, really. It seemed a deliberate choice and I wondered if it was related to a gender change from woman to man. Of course, that's completely irrelevant to the subject of quantum computing.
There are some basic linear algebra subroutines (matrix inversion, finding eigenvalues and eigenvectors) that can in theory be performed with an exponential speedup on a quantum computer; that's why there is so much interest in quantum machine learning. If you are asking about the current hardware level, then no, current quantum computers cannot solve any practical problem faster than a classical computer.
In theory, classical computers are also not proven to be inferior on the problems where quantum computers claim superiority, such as prime factorisation; Tang's quantum-inspired classical algorithm, which matches HHL for low-rank matrices, is a case in point.
> There are some basic linear algebra subroutines (Matrix inversion, finding eigenvalues & eigenvectors) that can be performed with an exponential speedup on a quantum computer in theory
Eh... those particular subroutines have polynomial time algorithms already on a classical computer.
You can't exponentially speed up something that's polynomial time already.
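For concreteness, the classical routines in question are standard dense linear algebra, cubic in the dimension with textbook methods (a rough sketch using NumPy):

```python
import numpy as np

# Classical dense linear algebra is already polynomial time: solving
# Ax = b or finding eigenvalues of an n x n matrix is O(n^3) with
# standard methods (LU factorization, QR iteration).
rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

x = np.linalg.solve(A, b)          # LU-based solve, O(n^3)
eigenvalues = np.linalg.eigvals(A) # QR iteration, O(n^3)

print(np.allclose(A @ x, b))       # True: x solves the system
print(eigenvalues.shape)           # (200,)
```

(The quantum claims are usually about poly-logarithmic dependence on the dimension under strong state-preparation assumptions, which is a different comparison than beating these polynomial-time routines outright.)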
(Though for the benefit of readers here, the list doesn't include any "basic linear algebra subroutines (Matrix inversion, finding eigenvalues & eigenvectors)").
The "linear systems" and "machine learning" algorithm paragraphs under "Optimization, Numerics, & Machine Learning" reference a number of resources regarding the currently understood limits of, and applications for, quantum computers in linear optimization.