Linux kernel mirror (for testing) git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git

ASoC: SOF: Intel: hda-pcm: Place the constraint on period time instead of buffer time

Instead of constraining the ALSA buffer time to be double the firmware
host buffer size, it is better to place the constraint on the period time.
This implicitly constrains the buffer time to a safe value
(num_periods is at least 2) and prevents applications from setting a
period size smaller than what the initial DMA burst will cover.

Fixes: fe76d2e75a6d ("ASoC: SOF: Intel: hda-pcm: Use dsp_max_burst_size_in_ms to place constraint")
Signed-off-by: Peter Ujfalusi <peter.ujfalusi@linux.intel.com>
Reviewed-by: Ranjani Sridharan <ranjani.sridharan@linux.intel.com>
Reviewed-by: Kai Vehmanen <kai.vehmanen@linux.intel.com>
Reviewed-by: Bard Liao <yung-chuan.liao@linux.intel.com>
Link: https://patch.msgid.link/20251002135752.2430-4-peter.ujfalusi@linux.intel.com
Signed-off-by: Mark Brown <broonie@kernel.org>

Authored by Peter Ujfalusi, committed by Mark Brown
45ad27d9 3dcf683b

+21 -8
sound/soc/sof/intel/hda-pcm.c
@@ -29,6 +29,8 @@
 #define SDnFMT_BITS(x)	((x) << 4)
 #define SDnFMT_CHAN(x)	((x) << 0)
 
+#define HDA_MAX_PERIOD_TIME_HEADROOM	10
+
 static bool hda_always_enable_dmi_l1;
 module_param_named(always_enable_dmi_l1, hda_always_enable_dmi_l1, bool, 0444);
 MODULE_PARM_DESC(always_enable_dmi_l1, "SOF HDA always enable DMI l1");
@@ -293,19 +291,30 @@
 	 * On playback start the DMA will transfer dsp_max_burst_size_in_ms
 	 * amount of data in one initial burst to fill up the host DMA buffer.
 	 * Consequent DMA burst sizes are shorter and their length can vary.
-	 * To make sure that userspace allocate large enough ALSA buffer we need
-	 * to place a constraint on the buffer time.
+	 * To avoid immediate xrun by the initial burst we need to place
+	 * constraint on the period size (via PERIOD_TIME) to cover the size of
+	 * the host buffer.
+	 * We need to add headroom of max 10ms as the firmware needs time to
+	 * settle to the 1ms pacing and initially it can run faster for few
+	 * internal periods.
 	 *
 	 * On capture the DMA will transfer 1ms chunks.
-	 *
-	 * Exact dsp_max_burst_size_in_ms constraint is racy, so set the
-	 * constraint to a minimum of 2x dsp_max_burst_size_in_ms.
 	 */
-	if (spcm->stream[direction].dsp_max_burst_size_in_ms)
+	if (spcm->stream[direction].dsp_max_burst_size_in_ms) {
+		unsigned int period_time = spcm->stream[direction].dsp_max_burst_size_in_ms;
+
+		/*
+		 * add headroom over the maximum burst size to cover the time
+		 * needed for the DMA pace to settle.
+		 * Limit the headroom time to HDA_MAX_PERIOD_TIME_HEADROOM
+		 */
+		period_time += min(period_time, HDA_MAX_PERIOD_TIME_HEADROOM);
+
 		snd_pcm_hw_constraint_minmax(substream->runtime,
-					     SNDRV_PCM_HW_PARAM_BUFFER_TIME,
-					     spcm->stream[direction].dsp_max_burst_size_in_ms * USEC_PER_MSEC * 2,
+					     SNDRV_PCM_HW_PARAM_PERIOD_TIME,
+					     period_time * USEC_PER_MSEC,
 					     UINT_MAX);
+	}
 
 	/* binding pcm substream to hda stream */
 	substream->runtime->private_data = &dsp_stream->hstream;