Remember Audience? The company's audio processors are used in a staggering number of mobile devices. In January, the latest eS3700 series debuted with always-on listening for voice-activated functions. Now, Audience is turning its attention to motion processing. The firm says its expertise in the audio field is particularly well suited to tackling motion, something humans also interpret with their ears.
Audience's first product in this arena is the MQ100 motion processor. This tiny 5.8-mm² chip performs two functions. First, it acts as a hub, collecting data from separate sensor hardware; the MQ100 supports inertial, magnetic, and environmental sensors. Second, it performs sensor fusion, interpreting that raw data to derive meaningful information that can be passed along to the host.
Sensor fusion is typically performed in software on the application processor, which isn't terribly power efficient. The MQ100 handles this task in hardware, allowing the SoC to sleep. Audience says the motion processor consumes no more than five milliwatts when active, a claimed 10-20 mW less than competing motion processors.
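To give a sense of what sensor fusion actually does, here's a minimal sketch of a complementary filter, one common fusion technique that blends gyroscope integration (accurate over short intervals) with accelerometer tilt (stable over long intervals) to estimate device orientation. The function name, signature, and blending constant are illustrative assumptions, not Audience's algorithm.

```python
import math

def fuse_pitch(prev_pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Estimate pitch (degrees) by fusing gyro and accelerometer readings.

    prev_pitch: last pitch estimate in degrees
    gyro_rate:  angular rate around the pitch axis in deg/s
    accel_x/z:  accelerometer readings (any consistent unit, e.g. g)
    dt:         time step in seconds
    alpha:      weight favoring the gyro's short-term accuracy
    """
    # Integrate the gyro rate to propagate the previous estimate forward.
    gyro_pitch = prev_pitch + gyro_rate * dt
    # Derive an absolute tilt angle from the gravity vector.
    accel_pitch = math.degrees(math.atan2(accel_x, accel_z))
    # Blend: gyro dominates short-term, accelerometer corrects drift.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

Running this loop on every sensor sample is cheap arithmetic, but waking a multi-core SoC hundreds of times per second to do it is what burns power; that's the workload the MQ100 moves into dedicated hardware.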
Low-power motion processing has interesting implications for always-on gesture controls in addition to health and fitness monitoring. Audience says games and mapping software will benefit, too. I can't envision the need for background motion monitoring for gaming, but if the MQ100 relieves the CPU of sensor fusion duties, more CPU resources should be available to smooth out gameplay.
Along with its dedicated motion processor, Audience is rolling out a couple of motion-infused additions to its eS3700 audio processor lineup. These chips offer sensor hub functionality, allowing them to use orientation data to determine the optimal microphone configuration. The so-called MotionQ intelligence detects whether the handset is being held or lying flat on a table and uses that information to choose between directional and 360° speakerphone modes, for example. There's no hardware acceleration involved, so the MotionQ processing must be handled by the SoC. It sounds like Audience has plans to combine motion and audio processing on a single chip, though.
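The orientation-driven mode switch described above boils down to a simple decision on top of fused tilt data. This is a hypothetical sketch in the spirit of MotionQ, not Audience's actual logic; the function name and tolerance value are assumptions.

```python
def speakerphone_mode(pitch_deg, roll_deg, flat_tolerance=10.0):
    """Pick a speakerphone mode from device tilt angles (degrees).

    A handset lying flat on a table has near-zero pitch and roll,
    so an omnidirectional 360° pickup pattern makes sense; a handset
    held upright points its mics at the user, favoring directional mode.
    """
    lying_flat = (abs(pitch_deg) < flat_tolerance
                  and abs(roll_deg) < flat_tolerance)
    return "360" if lying_flat else "directional"
```

The decision itself is trivial; the interesting part is that the audio chip can consume the fused orientation estimate directly instead of asking the host for it.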