In ALSA, a “period” corresponds to a “fragment” in the OSS world. The period defines the number of frames after which a PCM interrupt is generated. This size depends strongly on the hardware. Generally, a smaller period size yields more interrupts, that is, finer-grained control. For capture, the period size defines the input latency. On the other hand, the whole buffer size defines the output latency for the playback direction.