Chronotropic Incompetence after Heart Transplantation Is Associated with Increased Mortality and Decreased Functional Capacity

Introduction: The contribution of chronotropic incompetence to reduced exercise tolerance after a heart transplant is well known, but its role as a prognostic marker of post-transplant mortality is unclear. The aim of this study is to examine the relationship between post-transplant heart rate response (HRR) and survival. Methods: We performed a retrospective analysis of all adult heart transplant recipients at the University of Pennsylvania between the years 2000 and 2011 who underwent a cardiopulmonary exercise test (CPET) within a year of transplant. Follow-up time and survival status were observed through October 2019, using data merged from the Penn Transplant Institute. HRR was calculated by subtracting the resting HR from the peak exercise HR. The association between HRR and mortality was analyzed using Cox proportional hazard models and Kaplan–Meier analysis. The optimal cut-off point for HRR was generated by Harrell’s C statistic. Patients with submaximal exercise tests were excluded, defined by a respiratory exchange ratio (RER) cut-off of 1.05. Results: Of 277 patients with CPETs performed within a year post-transplant, 67 were excluded for submaximal exercise. In the 210 included patients, the mean follow-up time was 10.9 years (Interquartile range (IQR) 7.8–14). Resting HR and peak HR did not significantly impact mortality after adjusting for covariates. In a multivariable linear regression analysis, each 10-beat increase in heart rate response was associated with a 1.3 mL/kg/min increase in peak VO2 and a 48 s increase in the total exercise time. Each beat/min increase in HRR was associated with a 3% reduction in the hazard of mortality (HR 0.97; 95% CI 0.96–0.99, p = 0.002). Using the optimal cut-off point generated by Harrell’s C statistic, survival was significantly higher in patients with an HRR > 35 beats/min compared to those with an HRR < 35 beats/min (log rank p = 0.0012). 
Conclusion: In heart transplant patients, a low HRR is associated with increased all-cause mortality and decreased exercise capacity. Additional studies are needed to validate whether targeting HRR in cardiac rehabilitation may improve outcomes.


Introduction
Orthotopic heart transplantation (OHT) is a well-established treatment for patients with advanced heart failure, associated with significant improvements in both survival and quality of life [1]. While exercise capacity improves after transplant compared with pre-transplant levels, it remains lower than that of age-matched controls [2]. One of the driving forces behind this phenomenon is an abnormal chronotropic response to exercise, known as chronotropic incompetence (CI), due in part to the cardiac denervation that occurs during heart transplantation. CI is defined as the heart's inability to appropriately increase its rate in response to increased activity or demand, although a standardized definition is lacking [3]. Notably, the widely utilized formula described by Astrand et al. for age-predicted maximal heart rate (APMHR = 220 − age) was derived from a healthy population, with wide confidence intervals around this estimate [4]. Thus, it is unlikely to accurately reflect the maximal predicted heart rate (HR) in heart transplant patients with autonomic denervation. An alternative approach is to assess chronotropic response by comparing the peak HR at maximal exertion during the exercise test to the resting HR, a quantity known as the heart rate response (HRR). HRR allows for assessment of the dynamic HR range, taking the baseline HR into account rather than relying on peak HR alone. While chronotropic incompetence after heart transplantation is expected and is known to affect exercise capacity, it is unclear how HRR affects mortality in the transplant population [5,6]. In a previous study, our group demonstrated a positive association between post-transplant peak oxygen consumption (peak VO2) and long-term survival [7]. The aim of this study is to evaluate the association between post-transplant heart rate response and survival, and to describe the association of heart rate response with exercise capacity.

Design and Participants
We performed a retrospective analysis of all patients aged ≥18 years who underwent heart transplantation between 2000 and 2011 at the University of Pennsylvania. Survival data were collected through October 2019 using data merged from the Penn Transplant Institute. Patients who underwent re-transplantation or multi-organ transplant were excluded. This era was selected due to a clinical protocol at the time in which all patients underwent a post-transplant cardiopulmonary exercise test (CPET) within one year of transplant if clinically able. Patients who underwent CPET outside of their first post-transplant year were excluded, as were patients with permanent pacemakers. Pre- and post-transplant clinical data, including baseline demographics, CPET results, medications, laboratory values at the time of CPET, and post-transplant echocardiograms, were obtained from patients' electronic medical records. This study was approved by the University of Pennsylvania Institutional Review Board.

Cardiopulmonary Exercise Metrics
All patients performed a symptom-limited treadmill CPET according to our standard clinical practice, up to maximal volitional effort, as described in previous studies. Studies were interpreted by an advanced heart failure cardiologist. Prior to exercise, resting ECG and vital signs were recorded, including heart rate, as well as baseline respiratory mechanics and maximal voluntary ventilation (MVV). Breath-by-breath expired gases were obtained to estimate minute ventilation (VE), carbon dioxide production (VCO2), and VO2, which were displayed as 10 s averages. Vital signs were obtained multiple times throughout the CPET, including at peak exercise. Peak VO2 and the respiratory exchange ratio (RER) were determined as the highest 10 s averaged samples obtained during the exercise test. The RER was determined from the ratio of VCO2 to VO2. Maximal volitional exercise was defined by a peak RER ≥ 1.05, based on guideline recommendations [8,9]. O2 pulse was calculated as the ratio of unindexed VO2 to heart rate. The V-slope method was used to determine the ventilatory threshold (VT). Ventilatory equivalents for carbon dioxide (VE/VCO2) were measured at VT and at peak exercise, and end-tidal PCO2 was measured at peak exercise (PETCO2). Breathing reserve was calculated as the difference between MVV and maximum VE as a percent of MVV (i.e., [MVV − VEmax]/MVV × 100%). HRR was calculated by subtracting the resting HR from the peak HR. Heart rate reserve (distinct from the heart rate response, HRR) was calculated by dividing HRR by the difference between APMHR and resting HR, according to the following formula: heart rate reserve = ([peak HR − resting HR]/[APMHR − resting HR]) × 100 [8]. Only studies demonstrating evidence of maximal effort were used in the analyses.
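The derived chronotropic and ventilatory metrics above follow directly from the measured quantities. A minimal sketch (function and variable names are illustrative, not from the study's analysis code):

```python
def heart_rate_metrics(resting_hr, peak_hr, age):
    """Chronotropic metrics as defined above, with APMHR = 220 - age."""
    apmhr = 220 - age
    hrr = peak_hr - resting_hr  # heart rate response (HRR), beats/min
    # Heart rate reserve: HRR as a percentage of the maximal predicted range.
    hr_reserve_pct = (peak_hr - resting_hr) / (apmhr - resting_hr) * 100
    return hrr, hr_reserve_pct


def breathing_reserve_pct(mvv, ve_max):
    """Breathing reserve = (MVV - VEmax) / MVV * 100%."""
    return (mvv - ve_max) / mvv * 100


# Example: resting HR 90, peak HR 135, age 55 -> HRR = 45 beats/min;
# APMHR = 165, so heart rate reserve = 45 / 75 * 100 = 60%.
hrr, reserve = heart_rate_metrics(90, 135, 55)
```

Note that the two quantities diverge in denervated hearts: a transplant recipient with a high resting HR can have a normal heart rate reserve percentage but a low absolute HRR.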
The first post-transplant transthoracic echocardiogram obtained within one year of the CPET was used to compare peak VO2 to the left ventricular ejection fraction.

Statistical Analysis
Continuous variables were presented as mean ± standard deviation (SD) or as median with interquartile range (IQR) for skewed data. Categorical data were expressed as frequencies and proportions and compared using Fisher's exact test. Baseline characteristics of patients with an HRR above and below the sample median were compared using Student's t-test or the Wilcoxon rank-sum test, as appropriate.
To evaluate the association between HRR and all-cause mortality, Kaplan-Meier survival curves were generated for the two groups based on HRR above and below an optimal cut-off point generated by Harrell's C statistic. The survival endpoint was survival free from re-transplantation. The log rank test was used to compare the survival between the two groups. The relationship between HRR and all-cause mortality at follow-up was assessed using a Cox proportional hazards model, where HRR was entered as a continuous variable. The model was adjusted for covariates measured prior to CPET, including age, gender, race, BMI, beta-blocker use, hemoglobin at the time of CPET, a history of chronic obstructive pulmonary disease (COPD), and post-transplant ejection fraction (EF). Similar models were developed to examine the relationship between peak HR and mortality. Schoenfeld residuals were used to test the proportional hazard assumption of the Cox model.
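The optimal cut-off search described above relies on Harrell's C statistic, i.e., the concordance between predicted risk and observed survival among comparable pairs. A simplified illustrative implementation for right-censored data (a sketch, not the Stata routine used in the study):

```python
def harrells_c(times, events, risk_scores):
    """Simplified Harrell's C statistic for right-censored survival data.

    A pair (i, j) is comparable when patient i has the earlier follow-up
    time AND an observed event (a censored earlier time is uninformative).
    The pair is concordant when the earlier death has the higher predicted
    risk; ties in predicted risk count as half-concordant.
    """
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5
    return concordant / comparable
```

A C of 1.0 indicates perfect risk ordering, 0.5 indicates no discrimination; sweeping candidate HRR cut-offs and keeping the one that maximizes C is one way to derive a threshold like the 35 beats/min used here.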
To examine the relationship between HRR and peak VO2, we fit a multivariable linear regression model with HRR as the independent variable, adjusting for the same clinical and laboratory covariates as above. We repeated this method to assess the relationship between HRR and exercise time. All tests were considered significant at a two-sided alpha level of 0.05. The rate of missing data for each covariate did not exceed 15%. Missing covariate values in each model were considered missing at random and were imputed using multiple imputation with chained equations (twenty imputations), with estimates combined using Rubin's rules. All analyses were performed in Stata version 15 (StataCorp LLC, College Station, TX, USA).
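The single-predictor core of the linear models above can be sketched as follows (illustrative code with made-up inputs, not the adjusted Stata models used in the study):

```python
def ols_slope_intercept(x, y):
    """Ordinary least squares fit of y ~ a + b*x for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx      # slope: change in y per unit increase in x
    a = my - b * mx    # intercept
    return a, b


# With the study's reported coefficient (1.3 mL/kg/min per 10 beats, i.e.
# 0.13 per beat), a 10-beat/min higher HRR corresponds to 10 * 0.13 = 1.3
# mL/kg/min higher peak VO2.
```

In the study itself, the covariates listed above enter the model alongside HRR; this sketch shows only the interpretation of the HRR coefficient.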

Results
A total of 418 patients underwent heart transplantation at our institution. Of these, 277 were included after removing 115 patients with no CPET, 23 with a CPET performed more than one year post-transplant, and three patients who underwent re-transplantation. Among these 277 patients, 67 had submaximal CPETs and were excluded from the analysis. In the 210 patients with a maximal-effort CPET within one year of a first-time heart transplant, the mean follow-up time was 10.9 years, with an interquartile range of 7.8 to 14 years.
Baseline demographics and CPET parameters in patients above and below an HRR of 35 beats/min are displayed in Table 1. Patients with a lower HRR were more likely to have COPD (12% vs. 2%, p = 0.011) but otherwise had similar distributions of age, race, gender, and other pre-transplant comorbidities. Notably, there was no difference in post-transplant EF or beta-blocker use at the time of CPET. Several differences were present between the two groups with regard to CPET metrics. Patients with an HRR above 35 beats/min had a higher peak heart rate (134.9 vs. 110.3 bpm, p < 0.001), higher peak systolic blood pressure (157 vs. 147 mmHg, p < 0.001), higher peak VO2 (17.4 vs. 14.5 mL/kg/min, p < 0.001), longer total exercise time (9.3 vs. 7.7 min, p < 0.001), higher maximum voluntary ventilation (113.4 vs. 102.6 L, p = 0.011), higher heart rate reserve (59 vs. 27%, p < 0.001), and lower VE/VCO2 (37.9 vs. 40.5, p = 0.008).

Heart Rate Response and Survival
Of the 210 patients in this analysis, a total of 88 patients died over the course of up to 18 years of follow-up. HRR was independently associated with mortality both before and after adjustment for covariates (Table 2). Each beat/min increase in HRR was associated with a 3% reduction in the hazard of mortality (HR 0.97; 95% CI 0.96–0.99, p = 0.002). In Figure 1, the Kaplan–Meier curve demonstrates significantly increased survival in patients with an HRR ≥ 35 beats/min compared to those with an HRR < 35 beats/min across 15 years of follow-up. The 10-year survival rate was 64% in the lower HRR group and 82% in the higher HRR group. Although peak heart rate was associated with decreased mortality in the univariable analysis, it was no longer associated with mortality after adjustment for covariates. Figure 2 demonstrates a Kaplan–Meier survival curve stratified by peak HR above and below 140 beats/min. Resting heart rate was not associated with mortality in our cohort, even when including patients who had submaximal effort.
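Per-unit hazard ratios compound multiplicatively on the hazard scale, so under the Cox model's log-linearity assumption the reported HR of 0.97 per beat/min can be scaled to any HRR difference (a quick back-of-the-envelope check, not an analysis of the study data):

```python
# Under a log-linear Cox model, a 10-beat/min higher HRR scales the
# hazard by the per-beat hazard ratio raised to the 10th power.
hr_per_beat = 0.97
hr_per_10_beats = hr_per_beat ** 10  # roughly a 26% lower hazard
```

This is why a seemingly small per-beat effect (3%) translates into a clinically meaningful difference across the roughly 25-beat spread between the two HRR groups in Table 1.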

Heart Rate Response and Exercise Capacity
Linear regression models were constructed to examine the relationship between HRR, peak VO2, and treadmill time (Figures 3 and 4). In a multivariable linear regression analysis, each 10-beat increase in heart rate response was associated with a 1.3 mL/kg/min increase in peak VO2 and a 48 s increase in the total exercise time.

Discussion
Following a heart transplant, patients commonly have a persistently lower exercise capacity when compared with age-matched controls [10]. CI is a key contributor to this reduced exercise capacity [11–13]. However, the association between CI and mortality in this population has not been established, in part because no clear definition of chronotropic incompetence exists in patients after heart transplantation. The traditional formula for age-predicted maximal heart rate is unlikely to apply to these patients with a denervated orthotopic heart [4]. The HRR (i.e., the difference between the peak HR at maximal exertion and the resting HR) may better reflect CI in patients after OHT. In this study, we observed a significant association between higher HRR and longer post-transplant survival free from re-transplantation. Using the endpoint of survival, we then derived a threshold for HRR that optimized the calibration of the survival model.
To our knowledge, this is the first study to demonstrate the relationship between HRR and long-term post-transplant mortality. In our large cohort of OHT patients, we found that an attenuated heart rate response to exercise is associated with increased all-cause mortality over more than a decade of follow-up. More specifically, a heart rate response of less than 35 beats/min provides a useful threshold that optimizes the prediction of all-cause mortality. The association between HRR and survival held true when adjusting for demographic factors and relevant comorbidities. The differences in HRR in our cohort were primarily attributable to differences in peak heart rate, given the similar baseline resting heart rates. Notably, resting heart rate was not associated with the risk of mortality, even when including patients who had submaximal exercise stress tests. This is contrary to some of the current literature [14,15] and may be due to the different time points at which post-transplant resting heart rates were evaluated.
It is well known that chronotropic incompetence is associated with increased all-cause and cardiovascular mortality in both heart failure patients and healthy individuals [8,16–18]. This has not been thoroughly explored in the post-transplant population, where the denervated heart responds differently and may not reflect underlying structural changes in the heart predictive of survival. Our study suggests that a poor heart rate response to exercise is also associated with increased mortality in heart transplant patients.
The underlying mechanisms by which a higher HRR contributes to improved survival in heart transplant patients are unclear but likely multifactorial. The natural history of heart transplant patients involves gradual autonomic reinnervation of the cardiac allograft, which leads to an improved HRR to exercise [19]. The significant patient-to-patient variability in sympathetic reinnervation may contribute. HRR may also be linked to reduced mortality as a marker of exercise capacity and overall physical fitness. Heart rate response correlates with exercise capacity in the normal population, and the degree of physical fitness has been shown to be predictive of cardiac risk and mortality [3,20]. In our cohort, we demonstrated that HRR is associated with both peak VO2 and total exercise time, as also seen in other studies [12]. Given the strong relationship between HRR and peak VO2, it is possible that skeletal muscle metabolic abnormalities, pre-transplant sarcopenia and frailty, and deconditioning persist and contribute to the overall exercise limitation [21].
Our findings have important clinical implications. First, this study demonstrates that the assessment of HRR carries important prognostic value in the heart transplant population. Importantly, HRR is a potentially modifiable risk factor that may be improved by endurance and exercise training. Additional studies are needed to elucidate the mechanism of the increased mortality risk in patients with a lower HRR. Lastly, HRR may be a more accessible and inexpensive tool for risk stratification in place of peak VO2, which was recently found to be a predictor of mortality in the heart transplant population [7].

Limitations
This is a single-center study, which limits its generalizability. The cut-off value for heart rate response was based on an optimization algorithm and will need to be validated in other data sets. Patients who did not undergo CPET were excluded from our study, which introduces selection bias, as these individuals were more likely to have died before undergoing CPET or to have been too ill to undergo one. Patients without a CPET had higher mortality than patients with one; thus, our study likely underestimates the overall mortality of our OHT cohort. Although we included only patients who performed a CPET within a year of transplant, the time at which each patient underwent CPET varied. This variability may be subject to other confounding factors, such as incomplete recovery from surgery and less time to recover from pre-transplant sarcopenia and frailty. To limit such confounding, we included only patients with a respiratory exchange ratio (RER) greater than 1.05 and excluded patients with CPETs performed more than one year post-transplant. However, residual confounding cannot be entirely ruled out. This study is also limited by its retrospective nature, which increases the possibility of comorbidity misclassification.

Conclusions
In heart transplant patients, a low HRR is associated with increased all-cause mortality. HRR may be a more accessible and inexpensive tool for risk stratification when peak VO2 cannot be obtained. Future studies are needed to evaluate whether a reduced HRR is only a sign of frailty and lack of fitness and whether strategies can be implemented to improve the long-term outcomes of this patient population.

Institutional Review Board Statement:
The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board of the University of Pennsylvania (protocol code 849008).

Informed Consent Statement:
Patient consent was waived for this study. This is a retrospective study of preexisting clinical data, and obtaining a signed informed consent document is not feasible, as many patients are no longer available to authorize the use or disclosure of protected health information (PHI). Furthermore, a consent form would increase the risk of potential harm from a breach of confidentiality by serving as a stored "link" identifying the subject.

Data Availability Statement:
The data that support the findings are available on request from the first author, R.S.Z.

Conflicts of Interest:
The authors declare no conflict of interest.

Abbreviations
CPET    cardiopulmonary exercise test
HTR     heart transplant recipients
HRR     heart rate response
CI      chronotropic incompetence
APMHR   age-predicted maximum heart rate
OHT     orthotopic heart transplantation
MVV     maximum voluntary ventilation
PETCO2  peak end-tidal carbon dioxide
RER     respiratory exchange ratio
VCO2    carbon dioxide production
VE      minute ventilation
VO2     oxygen consumption
VT      ventilatory threshold