high-risk MDS was observed (HR 0.88, CI: 0.63-1.22; P=0.44). Exploration of the relationship between ESA treatment and pre-treatment transfusion status revealed a larger estimated effect of ESA on survival amongst patients who had not received RBCT prior to starting ESA treatment (HR 0.71, 95%CI: 0.49-1.03; P=0.07) than amongst patients who had received prior RBCT (HR 0.93, 95%CI: 0.70-1.26; P=0.67).
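As a rough illustration of this type of subgroup analysis (not the registry's actual code), a Cox proportional hazards model can be fitted separately in patients with and without RBCT prior to ESA. The sketch below uses the Python lifelines package; the column names (prior_rbct, esa_treated, os_months, death, age) and the adjustment set are hypothetical assumptions.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical flat extract: one row per patient; all column names are assumed.
df = pd.read_csv("eumds_esa_cohort.csv")

for prior, label in [(0, "RBCT-naive before ESA"), (1, "prior RBCT before ESA")]:
    sub = df[df["prior_rbct"] == prior]
    cph = CoxPHFitter()
    # Adjust for whatever confounders are available; 'age' is only a stand-in here.
    cph.fit(sub[["os_months", "death", "esa_treated", "age"]],
            duration_col="os_months", event_col="death")
    print(label)
    cph.print_summary()  # exp(coef) for 'esa_treated' is the subgroup hazard ratio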
Responding patients had a better prognosis, in terms of a lower risk for death (HR 0.65, 95%CI: 0.45-0.893; P=0.018). The effect of response on time to first post-ESA treatment transfusion was significant after stratification by pre-treatment transfusion experience. Importantly, and irrespective of response status, patients who received RBCT before starting ESA had a shorter time to their first post-treatment transfusion (median 6.1 vs. 23.3 months for non-transfused patients; HR 2.4, 95%CI: 1.75-3.31; P<10⁻⁴).
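The time-to-first-transfusion comparison can be illustrated with group-wise Kaplan-Meier estimates. The minimal sketch below, again using lifelines with assumed column names, shows how medians of the kind quoted above would be obtained; it is not the study's analysis code.

import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("eumds_esa_cohort.csv")  # hypothetical extract, assumed columns

kmf = KaplanMeierFitter()
for prior, label in [(1, "prior RBCT"), (0, "RBCT-naive")]:
    grp = df[df["prior_rbct"] == prior]
    # 'months_to_first_tx' is time to first post-ESA transfusion, censored at last follow-up.
    kmf.fit(grp["months_to_first_tx"], event_observed=grp["transfused"], label=label)
    print(label, "median months to first post-ESA transfusion:", kmf.median_survival_time_)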
This large observational study showed that the response rate to ESA, as well as the capacity of these agents to significantly delay the onset of a regular RBCT need, is most pronounced in RBCT-naïve patients. These results identify early initiation of ESA treatment as a relevant predictor of treatment response, and suggest that ESA should be recommended as first-line treatment in LR-MDS patients with symptomatic anemia, before starting regular RBCT.
Labile plasma iron levels and non-transferrin bound iron are early and clinically relevant indicators of iron toxicity and impact of iron chelation therapy on outcome in patients with lower-risk myelodysplastic syndromes receiving red blood cell transfusion
The majority of LR-MDS patients become RBCT dependent over time. With an expected survival of up to 12 years, these patients are prone to long-term accumulation of iron due to RBCT.33 Iron overload may also occur in a fraction of MDS patients who do not receive RBCT, resulting from the stimulation of intestinal iron absorption, mediated through suppression of hepcidin production by ineffective erythropoiesis.34 The toxic effects of iron overload in other iron-loading diseases are well known, but the consequences in MDS remain to be elucidated. To this end, we evaluated erythroid marrow activity, hepcidin levels, and body iron status, including non-transferrin bound iron (NTBI) and labile plasma iron (LPI) levels, over time in LR-MDS patients, and their relation with disease subtype and RBCT history within the EUMDS Registry.35
Detectable NTBI was already present in all patient groups at registration, with the highest levels in patients with MDS and ring sideroblasts (MDS-RS). The median LPI levels were below the level of detection in all patient groups at registration, except in transfusion-dependent (TD) MDS-RS patients.35 Hepcidin levels increased with the number of transfused units but, in contrast, decreased significantly over time in transfusion-independent (TI) MDS-RS patients. Soluble transferrin receptor (sTfR) levels increased significantly over time in both TI and TD MDS-RS patients (P-values from 0.01 to <10⁻³). Both elevated NTBI and LPI levels showed a threshold effect at transferrin saturation (TSAT) rates of >70% and >80%, respectively. Elevated LPI levels occurred almost exclusively in patients with MDS-RS and/or patients who had received RBCT. Once LPI levels were elevated, survival time decreased, with the greatest impact in TD patients (adjusted HR 4.03, 95%CI: 0.95-17.06; P=0.06).
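The reported threshold effect suggests a simple pre-screening rule based on TSAT. The sketch below is a minimal illustration, assuming the common conversion TIBC (µmol/L) ≈ transferrin (g/L) × 25.1, and flags the >70% (NTBI) and >80% (LPI) thresholds described above; it is not a validated clinical decision rule.

def tsat_percent(serum_iron_umol_l: float, transferrin_g_l: float) -> float:
    """Transferrin saturation (%) = serum iron / total iron-binding capacity x 100."""
    tibc_umol_l = transferrin_g_l * 25.1  # assumed standard conversion factor
    return 100.0 * serum_iron_umol_l / tibc_umol_l

def iron_toxicity_risk(tsat: float) -> str:
    # Thresholds taken from the threshold effect reported in the text.
    if tsat > 80:
        return "at risk of elevated LPI (and detectable NTBI)"
    if tsat > 70:
        return "at risk of detectable NTBI"
    return "below the reported TSAT thresholds"

# Example: serum iron 35 umol/L, transferrin 1.6 g/L -> TSAT ~87% -> LPI risk flag.
print(iron_toxicity_risk(tsat_percent(35.0, 1.6)))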
This study among LR-MDS patients showed that both treatment with RBCT and the presence of ring sideroblasts increased the occurrence of the toxic iron species NTBI and LPI in serum. These data suggest that body iron accumulation and toxic iron species (NTBI and LPI) occur mainly in MDS-RS patients along the axis of ineffective erythropoiesis, characterized by elevated sTfR, low hepcidin, and increased iron levels in some MDS subtypes, irrespective of receiving RBCT. Transfusional parenchymal iron overload, reflected by high serum ferritin levels, as well as direct iron toxicity, reflected by the presence of NTBI and LPI, was noted more frequently in MDS patients with ring sideroblasts compared to patients without ring sideroblasts. These data show that elevated LPI levels were associated with decreased survival, both in the overall population of this study and in the patient groups subdivided by RBCT status. This implies that the widely used parameter TSAT cannot by itself serve to predict survival; however, TSAT rates can be used as a pre-screening marker to identify patients who are at risk of developing elevated LPI levels and the associated poor prognosis. Finally, in a limited number of patients treated with iron chelators, we could demonstrate that LPI levels decreased to below the detection limit. This study suggested that NTBI and LPI may serve as early indicators of iron toxicity and as a measure of the effectiveness of iron chelation therapy in patients with lower-risk MDS.
Iron overload due to RBCT is associated with increased morbidity and mortality in patients with LR-MDS.36 Several studies have reported beneficial effects of ICT on survival and other clinical outcomes in MDS patients with iron overload.37,38 However, valid data on the effect of ICT are limited, since most studies have been performed in small, selected patient groups or suffer from serious methodological problems such as confounding by indication.38 Performing a randomized, controlled trial (RCT) to address this research question is challenging, and patients included in RCTs may not reflect the general LR-MDS population, which largely consists of patients of advanced age with multiple chronic, complex comorbidities. In addition to the possible beneficial effects of ICT on survival, increasing evidence indicates hematologic improvement in patients during ICT.39 Following improvement in cytopenias, transfusion independence is achieved in a minority of chelated patients.40,41
Results from a study conducted within the EUMDS Registry, including 490 non-chelated and 199 chelated patients and modelling ICT as a time-dependent variable, showed a hazard ratio for OS of 0.50 (95%CI: 0.34-0.74) after adjusting for relevant confounding factors. Restricting the analysis to the 150 patients who were initially treated with deferasirox resulted in an adjusted HR for OS of 0.38 (95%CI: 0.24-0.60), while patients who were initially treated with deferoxamine had inferior OS compared to deferasirox-treated patients (adjusted HR 2.46, 95%CI: 1.12-5.41). A propensity-score analysis matching for all relevant variables, combined with a multivariate Cox proportional hazards model restricted to the deferasirox-treated patients, resulted in an adjusted HR for OS of 0.34 (95%CI: 0.22-0.53). An erythroid response occurred in 77 chelated patients: 61 patients had a reduction in transfusion density, and 16 patients without a reduction in transfusion density became transfusion independent during at least one visit interval.
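Handling ICT as a time-dependent variable can be sketched with lifelines' CoxTimeVaryingFitter on a long-format dataset with one row per visit interval; the column names and adjustment variables below are assumptions for illustration, not the registry's actual variables or model specification.

import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format data: one row per patient per visit interval,
# with 'chelated' switching from 0 to 1 once iron chelation therapy starts.
long_df = pd.read_csv("eumds_long_format.csv")

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df[["id", "start", "stop", "event", "chelated", "age", "transfusion_burden"]],
        id_col="id", start_col="start", stop_col="stop", event_col="event")
ctv.print_summary()  # exp(coef) for 'chelated' corresponds to the adjusted HR (0.50 reported above)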
The TELESTO trial42 is the only prospective, randomized, placebo-controlled study of ICT in MDS patients, comparing deferasirox with a placebo-control group. This study