Mixing studies have traditionally been used to differentiate factor deficiencies from inhibitors. We will begin with a historical overview of how mixing studies were instrumental in the discovery of coagulation factors.
Although this test has been used for decades, it lacks standardization because PT and APTT reagents vary in their sensitivity to factor deficiencies and to non-specific inhibitors such as drugs and lupus anticoagulants (LA). We will discuss the advantages and disadvantages of the recommended methods for interpreting mixing studies (correction into the reference range, percentage correction, the Rosner index, and the estimated factor correction), and we will compare the accuracy of these cut-off methods using examples of single factor deficiency versus conditions associated with multiple factor deficiencies. Finally, we will discuss the use of mixing studies to detect inhibitors, including LA, examine the debate over incorporating mixing studies into LA testing algorithms, and illustrate the need for standardized LA interpretation guidelines with instructive cases.
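Because two of the interpretive methods named above are simple ratios of clotting times, a minimal computational sketch may help fix the definitions. The formulas below follow the widely cited definitions of the Rosner index (index of circulating anticoagulant) and percentage correction; the clotting-time values and the cut-offs in the comments are illustrative assumptions only, since appropriate cut-offs vary by laboratory and reagent.

```python
def rosner_index(ct_patient: float, ct_mix: float, ct_normal: float) -> float:
    """Rosner index (index of circulating anticoagulant), in percent:
    100 * (CT_mix - CT_normal) / CT_patient."""
    return 100.0 * (ct_mix - ct_normal) / ct_patient


def percentage_correction(ct_patient: float, ct_mix: float, ct_normal: float) -> float:
    """Percent correction of the prolonged clotting time by a 1:1 mix:
    100 * (CT_patient - CT_mix) / (CT_patient - CT_normal)."""
    return 100.0 * (ct_patient - ct_mix) / (ct_patient - ct_normal)


if __name__ == "__main__":
    # Hypothetical APTT values in seconds: patient, 1:1 mix, and normal pooled plasma.
    ct_patient, ct_mix, ct_normal = 62.0, 48.0, 30.0

    ica = rosner_index(ct_patient, ct_mix, ct_normal)
    pct = percentage_correction(ct_patient, ct_mix, ct_normal)

    # Commonly cited (but lab-specific, assumed here) thresholds:
    # Rosner index above ~15 favors an inhibitor; percentage correction
    # above ~70% favors a factor deficiency.
    print(f"Rosner index: {ica:.1f}")            # ~29.0 -> favors an inhibitor
    print(f"Percentage correction: {pct:.1f}%")  # ~43.8% -> incomplete correction
```

Note that the two indices can be discordant near their cut-offs in the same sample, which is one reason the choice of interpretive method and threshold matters for classification accuracy.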
Objectives