On a custom board based on the nRF52 running Arm Mbed OS, I am using custom touch sensors. The touch sensors use a TI MSP430FR with CapTIvate code for touch-event recognition. There are six sensors in total (each with its own MSP430FR), connected to the nRF52 host via I2C.
I am currently reading the sensor data output from the touch sensor application on the nRF52, and this works well. For this small project I would like to use the media player example from TI to identify certain touch gestures / touch scenarios (see [login to view URL]).
I need you to change the touch library in the Mbed OS based nRF52 project so that it reads the I2C output / key representations from the MSP430FR (to verify the output), following the linked example, identifies gestures, and uses them to play/pause, mute, control volume, and switch tracks. These "keys" are provided by the TI example mentioned above.
Note: The I2C communication, including interrupt handling, is already established, and audio playback is also already developed and working. The task is therefore mainly reading the MSP430FR interface from the nRF52 and using the "keys" to control the corresponding actions on the nRF52.
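To give a rough idea of the kind of mapping I have in mind, here is a minimal sketch. All key codes, the `TouchKey` enum, and the commented-out `audio.*` calls are hypothetical placeholders of my own, not taken from the TI example; the real key values would come from the CapTIvate configuration on the MSP430FR.

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Hypothetical key codes reported by the MSP430FR over I2C.
// The actual values must be taken from the TI media player example.
enum class TouchKey : uint8_t {
    None       = 0x00,
    PlayPause  = 0x01,
    Mute       = 0x02,
    VolumeUp   = 0x03,
    VolumeDown = 0x04,
    NextTrack  = 0x05,
    PrevTrack  = 0x06,
};

// Decode a raw byte read over I2C into a key event.
// Unknown codes fall back to None so spurious bytes are ignored.
TouchKey decode_key(uint8_t raw) {
    switch (raw) {
        case 0x01: return TouchKey::PlayPause;
        case 0x02: return TouchKey::Mute;
        case 0x03: return TouchKey::VolumeUp;
        case 0x04: return TouchKey::VolumeDown;
        case 0x05: return TouchKey::NextTrack;
        case 0x06: return TouchKey::PrevTrack;
        default:   return TouchKey::None;
    }
}

// Dispatch a decoded key to the already-existing audio-control actions.
// Returns the action name so the mapping can be logged and verified.
std::string dispatch(TouchKey key) {
    switch (key) {
        case TouchKey::PlayPause:  /* audio.togglePlayPause(); */ return "play/pause";
        case TouchKey::Mute:       /* audio.toggleMute();      */ return "mute";
        case TouchKey::VolumeUp:   /* audio.volumeUp();        */ return "volume up";
        case TouchKey::VolumeDown: /* audio.volumeDown();      */ return "volume down";
        case TouchKey::NextTrack:  /* audio.nextTrack();       */ return "next track";
        case TouchKey::PrevTrack:  /* audio.prevTrack();       */ return "previous track";
        default:                   return "none";
    }
}
```

On the Mbed OS side, the raw byte would come from the existing I2C/interrupt path (e.g. an `i2c.read(...)` triggered by the sensor's data-ready interrupt), so only `decode_key` and `dispatch` would need to be wired into the current touch library.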