I think we'd need an RF engineer to answer this more thoroughly, but it's my understanding that there are a lot of materials capable of absorbing the low-energy microwaves that it uses as its transmission medium.
Water is a big one, which I think is one reason that having my phone in my back pocket almost always results in a dropped connection to my headphones over the course of working outside for a few hours, compared to my side or front jacket pocket. Too much water in our tissues.
Hi! RF Engineer here. Bluetooth sucks for a number of reasons, only a few of which actually relate to RF.
First, we're bad at naming things. Is it BLE? Bluetooth Smart? BTLE? Bluetooth 4.0, or Bluetooth 4 Low Energy? And what does this have to do with ANT+? This doesn't seem to be a big problem, but it's a sign of deeper problems with the spec. In the long run it makes it very difficult for developers to implement.
Bluetooth 2.0 is radically different from BTLE. In 2.0, you had one device which acted like a microphone and another which acted like a speaker. This was like getting two people to dance together: it takes a lot of practice to get the joined movements right, and if one person stumbles, it all goes down like dominoes.
BTLE, on the other hand, acts like a (more familiar) client-server model. You come up to the bank teller, and have a limited set of operations you can perform (withdraw/deposit money, open/close account). Here's the catch though... the teller is only open for 10 seconds during a 24 hour day, and moves throughout the city. You have to arrive at just the right time, or you have to wait until tomorrow. (This was done for battery saving.) The throughput is also MUCH slower than 2.0, which makes applications like audio out of the question for now.
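If it helps to make the teller analogy concrete, here's a minimal sketch of a BTLE central reading a characteristic, using the third-party Python library bleak (my choice, any GATT client library would look similar). The Battery Level characteristic (0x2A19) is a standard GATT UUID, but whether a given peripheral actually exposes it is an assumption on my part:

    import asyncio
    from bleak import BleakScanner, BleakClient

    BATTERY_LEVEL_CHAR = "00002a19-0000-1000-8000-00805f9b34fb"  # standard GATT Battery Level UUID

    async def main():
        # Scan: you have to catch the peripheral during its advertising window,
        # i.e. show up while the teller is open.
        devices = await BleakScanner.discover(timeout=5.0)
        if not devices:
            print("Nobody at the teller window right now")
            return

        # Connect and perform one of the few operations GATT gives you:
        # read a characteristic's value (assumes the device exposes Battery Level).
        async with BleakClient(devices[0].address) as client:
            value = await client.read_gatt_char(BATTERY_LEVEL_CHAR)
            print(f"{devices[0].address}: battery {value[0]}%")

    asyncio.run(main())

Note the complete absence of anything stream-like: small reads, writes and notifications are all you get, which is why audio over BTLE was out of the question.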
There's also a lot of bureaucratic hype surrounding it. If you look at the releases from the BT SIG, it sounds very much like Java's claim that it runs on a bajillion devices. All in all, BTLE seems to be a solution in search of an IoT related problem, much like Java.
So in short, the reason I think it sucks is because it's a very complicated protocol with poor and confusing naming conventions. It sure doesn't help that it keeps getting re-invented! (Although BT5 seems to just be add-ons, finally!) Implementers (both on the HW side and the mobile/desktop side) have a difficult time figuring out how to do things correctly, much less why they need to be done. Things are getting better, but until the next "killer BTLE" application comes out, it's just heart rate monitors and useless iBeacons.
You worked on BT specs? BT seems an acute case of design by committee. It's huge, full of profiles and corners. Nobody implements it really well, and everything moves before it's finished. Makes you dream of wires sometimes.
I was involved in the 1.0 Bluetooth specifications and can confirm it was "designed by committee". IIRC the original wireless specification came from either Nokia or Ericsson and they drove a lot of the initial development. They had already developed the RF technology before the Bluetooth SIG was even formed. The original stated goal for Bluetooth was "cable replacement". Features such as pairing a headset to a phone, or a phone to a laptop were an essential part of the original profiles.
A reason for the complexity was that the BT 1.0 profiles often leveraged existing technology, for example:
- RFCOMM was a way of sending arbitrary serial data, reusing RS232 comms which were very common.
- OBEX was a way of sending data which was previously sent over IrDA
- The "LAN access profile" basically said "use RFCOMM to do PPP over a serial link like you do with a modem"
If you try to implement any of these from scratch, then not only do you have to implement the BT part, you also have to implement the technologies that BT reused (see the RFCOMM sketch below).
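To show how thin the RFCOMM layer is over the "serial cable" idea, here's a rough sketch using Python's built-in Bluetooth socket support. This is Linux-only, and the device address and channel number are placeholders I made up; the remote end has to be paired already and offering a serial-port-style service:

    import socket

    PEER_ADDR = "AA:BB:CC:DD:EE:FF"  # hypothetical paired device
    CHANNEL = 1                      # RFCOMM channel, normally discovered via SDP

    # RFCOMM behaves like a plain stream socket: an RS232 link without the wire.
    with socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                       socket.BTPROTO_RFCOMM) as s:
        s.connect((PEER_ADDR, CHANNEL))
        s.sendall(b"AT\r\n")         # arbitrary serial bytes, e.g. a modem-style command
        print(s.recv(1024))

Everything above the socket (the modem-style commands, PPP, OBEX framing and so on) is exactly the pre-existing technology the profiles reused.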
If you look at the initial SIG members, Nokia and Ericsson took care of the initial phone developments. Intel, IBM, Microsoft and Toshiba represented the PC side of things. I was working for 3Com at the time and we were interested in it as a short range network technology. 3Com developed a network device conforming to the "LAN access profile" but it was never released. 3Com also owned Palm and they were interested in incorporating BT with the hand held devices.
It is interesting to compare BT to the Wireless Ethernet (IEEE 802.11) world. The IEEE Ethernet (802.3) specifications are pretty much only concerned with getting data packets from A to B at layers 1 and 2. At layer 3 and above they don't care if those packets are IPv4, IPv6 or some other protocol like IPX.
Bluetooth tried to define everything from the RF communications all the way up to the application layer. The specification mentions how the PIN code request should be presented to the user when authorizing a new connection. It also mentions which audio codecs should be supported for streaming audio. The BT profiles also tried to define how to transfer files, business cards or print documents.
These detailed application layer specifications simply don't exist in something like Ethernet. There might be an argument that BT tried to over specify things but it was attempting to give a level of interoperability which we still struggle to achieve over other networks.
Yeah, that's how it felt: they provided an out-of-the-box full stack. Maybe this AND the field it lived in, cellphones as opposed to computers, made it too hard to do right. Cellphones have changed a lot since the BT 1.0 days; instead of implementing gradual layers, you have to deliver a monolith.
As a user I kinda like the profiles, in particular the OBEX based ones. This is because it means I can expect two devices to be able to share files out of the box if both of them have a bluetooth radio. Nothing like that exists for wifi, and I have to set up some client-server scheme to make anything happen.
The OBEX profiles predate even BT; they were inherited from IrDA. IrDA was an earlier method for transferring business cards or similar small files between hand held devices over an infra-red link.
Not that I disagree with you, but I would love to see examples of the opposite, where you would say it was beautiful and well implemented.
It tends to be hard to implement a specification without an implementation to test it against, but for the specs that do work out, we should look into why, so we can all learn from them.
I think Bluetooth is the worst spec ever. What we need is something like what happened with ARMv8: a clean-room implementation that incorporates everything learned before, plus a compatibility layer, much like 32-bit ARMv7 support on ARMv8.
It is a gigantic pile of mess and it never worked well enough for the majority of users.
Talking about design by committee, even 802.11 Wifi has improved a lot.
I really wish Apple would design a new one and force market adoption, with the spec kept open.
As long as you're within the communication range and in whatever your protocol calls a "reasonable" environment (e.g. not underwater), you're OK. The strength of the signals isn't chosen at random, it's chosen so that it allows devices to function well (within the design constraints -- range, environment conditions etc.) -- and so that it meets the power consumption constraints imposed by size, design, cost and technology.
tl;dr when someone decided to use a 4 dBm transceiver, they (should have) made a conscious choice about it and would have verified that it's enough.
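For a sense of scale, here's the kind of back-of-the-envelope link budget that choice implies. The free-space formula is standard; the 10 m range and the -90 dBm receiver sensitivity are numbers I'm assuming for illustration, not from any spec sheet:

    import math

    def fspl_db(distance_m: float, freq_hz: float) -> float:
        """Free-space path loss in dB (Friis)."""
        c = 3e8  # speed of light, m/s
        return (20 * math.log10(distance_m)
                + 20 * math.log10(freq_hz)
                + 20 * math.log10(4 * math.pi / c))

    tx_power_dbm = 4.0              # the transceiver in question
    rx_sensitivity_dbm = -90.0      # assumed receiver sensitivity
    loss_db = fspl_db(10.0, 2.4e9)  # ~60 dB over 10 m at 2.4 GHz in free space

    margin_db = tx_power_dbm - loss_db - rx_sensitivity_dbm
    print(f"path loss {loss_db:.1f} dB, link margin {margin_db:.1f} dB")  # ~34 dB of margin

Roughly 34 dB of margin in free space is plenty on paper, which is why it's usually the antenna, the enclosure and the water-filled user, not raw transmit power, that eat the budget.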
Quite plainly, if the connection drops when it shouldn't, it's either bad software or bad design.
Bad software is either the Bluetooth stack (especially in legacy Bluetooth devices; newer stacks tend to fare better) or the firmware that talks to the Bluetooth stack (if it's BLE, this is the safer bet).
Bad design on the hardware level is less excusable than it would seem, because a Bluetooth device is not exactly a long-haul wireless link that has to work during the mother of all thunderstorms. There's a wealth of information and modeling tools that help you with this stuff, too. Bad antenna design, bad filtering, bad casing are all common culprits, but I've seen a lot of electronic engineers who wouldn't call themselves RF engineers get it right by just following common sense and doing the math.
Sometimes (but even less often) it's not a design problem, it's a manufacturing problem (e.g. PCB antennas that get damaged or don't get etched right). But this is pretty rare.
Certainly, the operating environment plays a role, but that's why you consider it while you're designing the whole gizmo. You don't (or shouldn't) get to shrug and say hey, it's not my fault we're basically walking cucumbers.
So if a pair of super high-end Bose speakers and a super trendy iPhone keep dropping the connection while they're within range, and not inside and, respectively, outside a Faraday cage or whatever, the reason isn't the black magic that RF design is, it's Bose's and Apple's profit margin.
Edit: I do concur with the other fellow who posted here, Bluetooth really is complex and the band it operates in isn't exactly a charm to work with, but frankly, neither of these reasons account for the huge range of devices that are simply badly designed and/or run bad software. There are a lot of Bluetooth devices that work just fine, a lot of other protocols that operate in the 2.4 GHz band and work just fine, a lot of other protocols that operate in similarly noisy bands and work just fine and a lot of protocols that are more complex and work just fine.