I have an application that uses an EFR32MG13 (+19.5 dBm BT Tx output power) and has a good deal of RF attenuation (about 9 dB) between the MG13 Tx output and the antenna. I used gecko_cmd_system_set_tx_power(100) to set the Tx output power to 10 dBm for BLE and BT5 transmissions, and I configured my gecko config structure with .rf.tx_gain = -90 to tell the stack about the 9 dB of attenuation. Based on the article below, I would have expected to see 19 dBm coming out of the chip, so that the power at the antenna input would be 10 dBm after the RF attenuation. However, I was unable to get more than about 14 dBm out of the chip. Does anyone know why I'm seeing only 14 dBm when I expect 19 dBm in this scenario?
Thanks for this feedback. Will run some tests to see if it works and get back to you!
OK, it turns out there was never really a problem. We hadn't configured our spectrum analyzer properly for power measurements. We are now able to transmit up to 20 dBm from the chip, which yields 10 dBm at the antenna after our 10 dB of attenuation. Sorry for the confusion.