Long Term Outcomes of Kidney Transplant: Characteristics of Recipients with 20 or More Years of Graft Survival

Objective: Kidney transplant survival in the first year after transplantation has improved significantly, although long-term results are less encouraging. In recent years, pressure has grown on the scientific community to refine methods for identifying factors that can predict graft survival 10, 20, or more years after transplantation. Few previous studies have evaluated patient and laboratory characteristics associated with optimal long-term graft survival. The objective of this study was to identify factors associated with very long-term survival of the transplanted kidney. Methods: We retrospectively studied adults who had received first-time, single kidney transplants between 1967 and 1991 at the S. Orsola University Hospital Kidney Transplant Centre in Bologna. We compared the clinical, immunological, and laboratory profiles of patients whose grafts were still functioning ≥ 20 years after kidney transplantation to those of patients whose transplants survived <20 years. Results: We identified 111 patients (24.5%) with a functioning graft for 20 or more years after transplantation. Female gender, living donor, younger donor age, shorter delayed graft function (DGF) duration, lower one-year creatinine, and higher one-year eGFR predicted ≥ 20-year functional graft survival in the univariate analyses. In the multivariate analysis, only female gender, shorter DGF duration, and one-year creatinine and eGFR remained significant predictors of graft survival ≥ 20 years. Conclusions: To our knowledge, this is the first report of 20-year graft survival being associated with one-year renal function. Accordingly, efforts should be targeted at preserving graft function in the first year after kidney transplantation. In addition, we have identified a population of long-term kidney transplant survivors who will be the subject of further studies aimed at clarifying the mechanisms of immunological tolerance to the transplant.


Introduction
For more than 40 years, kidney transplantation has been the treatment of choice for chronic renal failure. Recent studies have demonstrated significant improvements in graft survival in the first year after transplantation because of significant advances in immunosuppressive therapy, with a consequent reduced incidence of acute rejection [1]. However, no similar improvement in long-term graft survival has been observed [1][2][3][4][5]. Nevertheless, a few studies have identified an emerging population of patients whose graft has survived for more than 20 years; these have been called ultra-long-term survival kidney transplant recipients (ULS). The number of studies examining ULS is limited because of the difficulty in following representative kidney transplant recipients for a prolonged period. In their retrospective analysis of the Irish Renal Transplant Database, Traynor et al. [6] reported that 21.3% of 1174 transplants functioned for 20 years or more. These authors found that older recipient age, male gender, presence of acute rejection, and deceased donor transplant were factors significantly associated with long-term graft loss. However, because of limitations of this database, some important clinical features were not assessed. Bererhi et al. [7] analyzed the clinical and immunological features of 56 patients with a first renal transplant that functioned for more than 30 years. However, living donors and young recipients and donors were highly represented in their study population. Because of this and the small sample size, the results of this study are primarily descriptive and do not have widespread applicability.
Characterization of the clinical, immunological, and laboratory profile of patients who exhibit an extremely favorable transplant outcome could facilitate early intervention to reduce risk factors in order to prevent graft failure. The aim of our study is thus to compare the clinical, immunological, and laboratory profiles of kidney transplant recipients with a transplant functioning for 20 years or more to those of patients whose transplant functioned for less than 20 years.

Materials and Methods
We retrospectively reviewed all clinical charts of adult patients (≥ 18 years old) who received a kidney transplant between October 24, 1967 and December 31, 1991 at S. Orsola University Hospital Kidney Transplant Centre in Bologna, Italy. Transplants have been performed at this center since 1967, and all patients with functioning grafts are seen for routine follow-up on an annual basis. This cohort was followed for at least 20 years. The outcome of interest was failure of the first transplant. We included single kidney ABO-compatible transplants from living or deceased donors. Multiorgan transplants and second or third kidney transplants were excluded. The protocol was approved by our Institutional Ethics Committee.
The study cohort was divided into two groups: patients with a kidney transplant surviving for at least 20 years (ULS group) and patients with a kidney transplant surviving less than 20 years (Standard Survival, SS group). The two groups were compared for these variables: demographic (gender, donor and recipient age, and living or deceased donor), clinical (cold ischemia time, delayed graft function [DGF] duration, and immunosuppressive therapy at discharge after transplantation), immunological (number of human leukocyte antigen [HLA] mismatches, peak percentage of panel reactive antibodies [PRA], and number of acute rejection episodes), and laboratory (1-year creatinine and estimated glomerular filtration rate [eGFR; calculated using the Modification of Diet in Renal Disease formula]). HLA status was defined by HLA A and B loci in the 1970s and extended to HLA A, B, and DR loci in the 1980s.
DGF was defined as dialysis requirement in the first week posttransplant. DGF duration was defined as the number of days elapsed between the transplant and the last dialysis session.
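The eGFR values above were derived from serum creatinine using the MDRD equation. As an illustration, below is a minimal sketch of the widely cited 4-variable, IDMS-traceable form (coefficient 175); the study period predates this recalibration, so the original calculations may have used the earlier 186-coefficient version, and the function and variable names here are our own:

```python
def egfr_mdrd(creatinine_mg_dl, age_years, female, black=False):
    """4-variable MDRD estimate of GFR in mL/min/1.73 m^2
    (IDMS-traceable form with coefficient 175)."""
    egfr = 175.0 * creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: a 40-year-old woman with a 1-year serum creatinine of 1.1 mg/dL
print(round(egfr_mdrd(1.1, 40, female=True), 1))  # ≈ 55.0
```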

Immunosuppressive therapy
In our series, all patients were treated with intravenous methylprednisolone at a dose of 500 mg on the first day posttransplant, 250 mg on the second day, and 125 mg on the third day. The dose was subsequently tapered and then replaced with oral prednisone at a maintenance dose of 5 mg daily.
Prior to 1985, almost all patients in our center received steroids plus azathioprine (2 mg/kg once daily) during hospitalization and at discharge. From 1983, cyclosporine was introduced gradually, initially reserved for transplants at higher immunological risk and later used systematically.
Cyclosporine was administered at a starting dose of 8 mg/kg twice daily and subsequently adjusted to achieve trough levels of 250-300 ng/mL until the sixth month post-transplant and of 150-200 ng/mL thereafter.
In this study, given the length of follow-up, it was not possible to trace all changes of therapy during the life of the transplant. For this reason, we evaluated the therapy at discharge.

Statistical analysis
Continuous variables were summarized as mean ± standard deviation, except where otherwise indicated. Comparisons between study groups were performed using t-tests, χ² tests, or Mann-Whitney tests, as appropriate. Variables that differed between the ULS and SS groups at a p-value ≤ 0.10 in univariate analyses were included in multivariate logistic regression analyses. To determine which combinations of demographic and diagnostic variables best distinguished SS from ULS patients, data were analyzed using a classification tree analysis (CTA) based on a chi-square automatic interaction detection procedure (CHAID). Regardless of graft survival status, patients treated with azathioprine were compared with those treated with cyclosporine or the combination of azathioprine plus cyclosporine for a number of characteristics, using the same statistical tests. The significance level was set at p<0.05. All statistical analyses were performed using IBM SPSS Statistical Software, version 20.0.
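For a 2×2 comparison such as gender by survival group, the Pearson χ² statistic used in these analyses can be computed directly from the contingency table; a minimal sketch with hypothetical counts (not the study's actual data):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    numerator = n * (a * d - b * c) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator

# Hypothetical table: rows = two patient groups, columns = outcome yes/no
print(chi_square_2x2(20, 30, 30, 20))  # → 4.0
```

In practice, packaged routines (e.g., as implemented in SPSS) also return the associated p-value from the χ² distribution with one degree of freedom.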

Results
The study cohort included 452 patients with a mean age of 35.6 ± 10.9 years. Of these, 28 (6.9%) underwent kidney transplantation between 1967 and 1976, and the remaining 424 underwent transplantation thereafter. The mean graft survival was 11.05 ± 9.42 years in the overall cohort, and the median estimated survival was 14.2 years (95% confidence interval, 12.2-16.2 years). The ULS group included 111 (24.5%) patients and the SS group included 341 (75.5%) patients. The ULS group had a mean graft survival of 24.6 ± 4.2 years, and the SS group had a mean graft survival of 6.6 ± 5.7 years. Patient survival was 92.3% at 1 year, 84.2% at 5 years, 75.9% at 10 years, and 64.8% at 20 years. The causes of patient death were cardiovascular disease (31.8%), infection (30.3%), malignancy (10.7%), other (18.9%), and unknown (9.0%).

Table 1 shows the characteristics of the patients and donors in the two groups. In univariate analysis, ULS patients were significantly more likely than SS patients to be female and to have a younger and living donor; they were less likely to have a longer DGF. Multiple logistic regression analysis including recipient gender and age, donor type (living or deceased), and DGF duration revealed that female recipients were 2.4 times more likely than men to be ULS. Indeed, men had a higher mortality rate than women (30.5% vs. 18.7%, χ²=6.71, p=0.01). A shorter DGF duration was also associated with an increased likelihood of being in the ULS group. Recipient age and donor type were not related to graft survival (Table 2).

Classification tree analysis was performed in the full cohort to predict long-term graft survival as a function of recipient age, gender, type of donor (living or deceased), acute rejection, DGF duration, ischemia time, donor age, and creatinine levels at 1 year after transplantation. The results indicated that, once serum creatinine levels were included, no other variable was useful for discriminating between ULS and SS.
Specifically, patients with serum creatinine ≤ 1.1 mg/dL were more likely to be ULS (64.7%) than those with higher serum creatinine levels (1.1-1.5 mg/dL, 36.2%; 1.5-1.8 mg/dL, 13%; and >1.8 mg/dL, 1.4%) (Figure 2). A second CTA, in which serum creatinine was replaced with eGFR, showed that patients with an eGFR >61 mL/min/1.73 m² were more likely to be ULS (52.9%) than those with an eGFR of 46-61 mL/min/1.73 m² (24.8%) or <46 mL/min/1.73 m² (7.4%) (Figure 3). When comparing patients with a creatinine ≤ 1.1 mg/dL at 1 year after kidney transplant to those with a serum creatinine >1.1 mg/dL, only gender, among the demographic and clinical characteristics considered, differed between groups: female gender was significantly associated with serum creatinine ≤ 1.1 mg/dL (females in the ≤ 1.1 mg/dL group vs. the >1.1 mg/dL group: 56.5% vs. 24.7%, χ²=26.3, p<0.001).
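The classification tree's creatinine strata and the reported ULS probabilities amount to a simple lookup. A sketch using the percentages reported above; the handling of the exact bin boundaries is our assumption, and the function name is our own:

```python
def uls_probability_pct(creatinine_1yr):
    """Reported probability (%) of >=20-year graft survival by 1-year
    serum creatinine (mg/dL) stratum, per the classification tree."""
    if creatinine_1yr <= 1.1:
        return 64.7
    elif creatinine_1yr <= 1.5:
        return 36.2
    elif creatinine_1yr <= 1.8:
        return 13.0
    else:
        return 1.4

print(uls_probability_pct(1.0))  # → 64.7
```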
Secondary analyses were conducted to compare the numbers of matched alleles for HLA A, B, and DR genes between the two groups in the 225 patients with available data; no statistical differences were found between groups. Similarly, no difference between groups was found for the total number of HLA mismatches.
Regarding immunosuppressive therapy, most (97.4%) patients in the study cohort were treated with steroids at discharge; of these 55.3% received steroid plus azathioprine, 38.9% received steroids plus cyclosporine and 5.8% received combination treatment (azathioprine plus cyclosporine plus steroid).
A higher proportion of ULS than SS patients had been treated with cyclosporine at discharge (39.6% vs. 34%), although the difference was not statistically significant (p=0.337) (Table 1).
A comparison of the therapeutic regimens showed that the combination of cyclosporine plus steroid was more frequent among ULS (40.4% vs. 38.4%), whereas the combination of azathioprine plus steroid was more frequent in the SS group (57.3% vs. 49.5%); the combination of cyclosporine and azathioprine was more common in the ULS group (10.1% vs. 4.2%). None of these differences between ULS and SS patients reached statistical significance.
Lastly, we assessed the risk of a transplant duration of less than one year, between one and twenty years, and more than twenty years according to treatment with cyclosporine versus other medications. Patients treated with cyclosporine had a lower risk of losing their graft in the first year after transplantation (13.0% vs. 20.3%) and a higher likelihood of having a graft lasting longer than 20 years (27.2% vs. 20.7%), with a trend towards statistical significance (p=0.078).

Discussion
The ultra-long-term outcome of kidney transplants has received little attention in the literature to date. In the current study, 24.5% of our single kidney transplant recipients exhibited graft survival of 20 years or longer. This result is comparable to, although slightly higher than, the 21.7% reported by Traynor et al. [6]. Univariate analyses revealed that female gender, a living donor, younger donor age, shorter DGF duration, and superior renal function (serum creatinine and eGFR) at 1 year after transplantation predicted functional graft survival for at least 20 years. No other clinical, immunological, or laboratory characteristic predicted ultra-long-term graft survival. Subsequent multivariate analysis confirmed that female gender, shorter DGF duration, and better renal function at 1 year were associated with a higher likelihood of being one of the ULS.
Previous studies have shown that renal function parameters (creatinine and eGFR) in the first year after transplantation are important predictors of graft survival. One-year serum creatinine concentration has been associated with graft survival at 3 years [8], 5 years [9], and 10 years [10]. The present study, to our knowledge, is the first to examine the relationship between renal function parameters at 1 year after transplant and graft outcome at 20 years or later: our results indicate that renal function at 1 year is strongly predictive of graft survival for at least 20 years. Specifically, classification tree analysis identified different probabilities of being ULS based on 1-year eGFR and 1-year creatinine levels. The creatinine and eGFR cut-off values associated with longer graft survival were similar to those reported in the literature for shorter follow-up periods. We found that patients with an eGFR >61 mL/min/1.73 m² had an approximately 53% likelihood of being ULS. Patients with a 1-year creatinine ≤ 1.1 mg/dL had a 64.7% likelihood of being ULS, whereas in patients with a creatinine >1.8 mg/dL the probability of being ULS decreased to 1.4%. Fitzsimmons et al. [8] reported that a 1-year serum creatinine ≤ 1.5 mg/dL was associated with significantly lower 3-year graft loss. Similar results were reported by Hariharan et al. [10] at 5-year follow-up. In a pediatric population, Tejani et al. [9] found that patients with a creatinine clearance <50 mL/min had a greater than one in three likelihood of graft loss within 3 years, with an annual 13% risk of graft failure. These results indicate that renal function can be regarded as a fundamental variable for predicting the long-term outcome of kidney transplants [11][12][13]. Other authors have suggested that many immunological and nonimmunological factors, arising as early as 1 year after transplantation even in patients with good renal function, are associated with early graft loss.
These include the onset of post-transplant donor-specific antibodies [14][15][16], interstitial fibrosis and tubular atrophy [17][18][19], and recurrence of nephropathy [20,21]. We were not able to investigate all factors that may lead to early graft dysfunction after renal transplantation because data for our patients were limited, but our findings suggest that functional decline within the first year after transplantation is correlated with a poorer outcome at 20 years or later.
In our study, a higher percentage of ULS patients had initially been immunosuppressed with cyclosporine, although this result did not reach statistical significance.
A closer examination of the therapy data showed that graft loss in the first year after transplantation occurred less frequently in patients treated with cyclosporine, and that the percentage of patients whose transplant survived longer than 20 years was higher in this group, with a trend toward statistical significance.
This finding is consistent with that of Traynor et al. [6] and may be attributed to the higher efficacy of calcineurin inhibitors in preventing acute rejection. Our finding that women were more than twice as likely as men to be ULS was similar to the findings of Traynor et al. [6] and Mayer-Kriske et al. [22]. These authors suggested that this may reflect the higher cardiovascular mortality of men. Other factors that may account for the gender difference include the effects of sex hormones or superior treatment compliance in women [23]. Regarding DGF, evidence from the literature indicates that this factor has a negative effect on graft survival [24][25][26][27][28]. We found that not only the presence of DGF, but also its duration, affects kidney transplant outcome: for each additional day of DGF, the probability of being one of the ULS decreased by 5.7%.
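If the reported 5.7% per-day decrease reflects a constant per-day odds ratio of about 0.943 from the logistic model, the effect compounds multiplicatively over the DGF duration. A hedged illustration of this interpretation (ours, not an analysis of the study data; the function name is our own):

```python
def cumulative_odds_ratio(per_day_or, days):
    """Cumulative odds ratio over `days` days of DGF, assuming a constant
    per-day odds ratio under a logistic model."""
    return per_day_or ** days

# With a per-day OR of ~0.943, a week of DGF cuts the odds of being ULS
# to roughly two-thirds of baseline
print(round(cumulative_odds_ratio(0.943, 7), 2))  # → 0.66
```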
Our study found no immunological predictor of long-term graft survival, although before 1980, HLA DR was not typed, and no information was available regarding the formation of post-transplant donor-specific antibodies or the recipient's lymphocyte subpopulations. However, previous studies regarding the association between immunological characteristics and long-term graft outcomes have produced conflicting results. Traynor et al. [6] demonstrated an association between the number of mismatches and transplant outcome, whereas Smail et al. [11] found that PRA class and number of mismatches did not significantly impact the 10-year graft outcome in their population of 516 kidney transplant recipients.
The present study has limitations. Statistical analysis of longitudinal observational data is notoriously challenging, complicated by the potential for unforeseen bias and confounding factors. When interpreting our data, it is also important to bear in mind that the study cohort had unique characteristics differentiating it from current transplant populations. For instance, the donors and recipients were significantly younger than current donors and recipients [29,30]. It is also difficult to assess the impact of immunosuppressive therapy on the course of the transplant in our population. Indeed, we only evaluated the therapy initially received because of the complexities of documenting the shifts in treatment that occurred over such a prolonged period of follow-up. Moreover, most of the therapies were different from those used currently.
In conclusion, graft survival for 20 years or longer is no longer uncommon, occurring in approximately one-quarter of this population of single kidney transplant recipients. Our study indicates that renal function at 1 year after transplantation is of fundamental importance for the 20-year survival of kidney transplants. No other parameter characterized ULS, except female gender and shorter DGF duration. These results thus suggest that efforts should be targeted at strictly monitoring patients in order to preserve graft function in the first year.
In addition, our study shows that there is a population of long-term surviving kidney transplant recipients that is not strictly characterized by clinical and immunological determinants. This suggests that other pre-existing conditions, such as the patient's genetic and molecular features and graft-related characteristics, may affect long-term graft survival and could be connected to the phenomenon of immunological tolerance. Further studies are warranted to better understand the molecular mechanisms that underlie immune tolerance to the transplant.