I've benchmarked SQLite on my laptop at just north of 20,000 queries (SELECTs) per second for the workload I was interested in at the time.
However, the limit depends heavily on your workload. If your working set (the data that all the simultaneous queries touch) fits in memory, then you'll be limited by memory bandwidth or by overheads in the language you're calling from: GC stalls and per-query statement preparation can really hammer throughput. And if you do any actual work with the results of the query, I'd expect the time spent there to dominate the data access.
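As a rough illustration of the in-memory case, here's a minimal Python micro-benchmark sketch; the schema, row count, and query are made up for the example, and the numbers you see will vary widely with hardware and driver overhead:

```python
import sqlite3
import time

# In-memory database, so the working set trivially fits in RAM.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany(
    "INSERT INTO t VALUES (?, ?)",
    [(i, f"row-{i}") for i in range(10_000)],
)
conn.commit()

# Time a burst of simple indexed SELECTs. Python's sqlite3 caches
# prepared statements internally, so reusing one SQL string avoids
# re-preparing the query on every call.
n = 100_000
start = time.perf_counter()
for i in range(n):
    conn.execute("SELECT val FROM t WHERE id = ?", (i % 10_000,)).fetchone()
elapsed = time.perf_counter() - start
print(f"{n / elapsed:,.0f} queries/sec")
```

Note that a benchmark like this mostly measures the calling language's per-call overhead, which is exactly the point: SQLite itself is rarely the bottleneck for indexed point lookups on in-memory data.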
If your data doesn't fit in memory then you'll be limited by the speed of your disk system. If you have a fixed working set then you can increase the performance by adding RAM until the working set fits again.
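Besides adding RAM, one concrete knob for the fixed-working-set case is SQLite's own page cache, which can be enlarged so the hot pages stay resident. A sketch, where the database path and cache size are illustrative:

```python
import os
import sqlite3
import tempfile

# Illustrative file-backed database (path is made up for the example).
path = os.path.join(tempfile.mkdtemp(), "data.db")
conn = sqlite3.connect(path)

# A negative cache_size is interpreted in KiB: -262144 asks SQLite for
# roughly a 256 MiB page cache, sized so the working set fits in it.
conn.execute("PRAGMA cache_size = -262144")
print(conn.execute("PRAGMA cache_size").fetchone()[0])  # → -262144
```

The setting is per-connection and reverts on reconnect, so it needs to be applied whenever a connection is opened.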
If you have a streaming workload then you'll never be able to fit it in RAM (by definition), so you'll be limited by the speed of the IO systems involved, whether network or disk.
If things are still too busy then stuff will start to queue up. This will cause each individual process to run slower as it contends with the others, which, in turn, will cause more incoming work to queue up. At some point the machine will run out of some other resource (memory for incoming connections, network sockets, space in queues, etc.). How the system responds to that depends on how it's configured but, traditionally, your system is no longer providing service.
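One common configuration choice at that point is to shed load rather than let the backlog grow without bound. Here's a minimal sketch using a bounded queue that rejects new work once full; the names and limits are made up for illustration:

```python
import queue

# Bounded work queue: once it's full, new work is rejected immediately
# instead of being queued behind an ever-growing backlog.
work = queue.Queue(maxsize=100)

def submit(item):
    """Return True if the item was accepted, False if we're saturated."""
    try:
        work.put_nowait(item)  # non-blocking; raises queue.Full when at capacity
        return True
    except queue.Full:
        return False

# With no consumer draining the queue, only the first 100 fit.
accepted = sum(submit(i) for i in range(150))
print(accepted)  # → 100
```

Rejecting early like this keeps latency bounded for the requests you do accept, at the cost of explicitly failing the rest, which is usually preferable to the whole machine grinding to a halt.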