Abstract
AIM To identify and quantify, on the basis of finalised data from the Corneal Transplant Follow up Study, factors influencing corneal graft outcome in terms of graft survival, rejection, visual acuity, and astigmatism.
METHODS Multifactorial analysis of 2777 grafts registered by the UK Transplant Support Service from July 1987 to June 1991.
RESULTS Several recipient factors influencing graft survival, rejection, and visual acuity were identified, but no donor factors. Of the operative factors amenable to change, mixed suturing was associated with reduced graft survival, and larger grafts with increased risk of rejection but better visual acuity when surviving. There was increased risk of rejection with poor matching at HLA class I antigens, but mismatched HLA-DR grafts suffered less rejection than those with zero HLA-DR mismatches. Recipient age below 10 years was associated with increased risk of both rejection and graft failure. However, whereas increasing age above 10 years was not associated with differential graft survival, it was significantly associated with decreasing risk of rejection.
CONCLUSIONS While possible benefits of HLA-A and HLA-B matching are confirmed, the expense and delay involved in awaiting HLA-DR matched tissue are unlikely to be justified. Other donor factors are unrelated to graft outcome once tissue has been screened by eye banks. The highest rates of graft failure and rejection occur in the early postoperative period, and factors influencing visual outcome are also apparent at this stage.
- corneal transplant
- graft failure
- rejection
The Corneal Transplant Follow up Study (CTFS) sought information on grafts performed in the United Kingdom from July 1987 to June 1991.1 Detailed analysis of an interim dataset related graft outcome to recipient factors,2 donor factors,3 including tissue matching,4,5 and operative factors.6 Here we present results on the basis of finalised data in order to confirm or refute the findings reported in previous publications.
Methods
Details of data collection1 and analytical methods2,3,6 are given elsewhere. In summary, donor records were made available by the United Kingdom Transplant Support Service Authority (UKTSSA) and Corneal Transplant Service eye banks. Forms concerning recipient medical history, clinical condition, and operative method were sent to surgeons at or around the time of operation. Follow up forms at 3 and 12 months detailed graft status, rejection episodes, and clinical record.
Outcome was assessed in terms of graft survival, time to first occurrence of rejection, and, for functioning grafts, visual acuity and astigmatism. Multifactorial statistical modelling was undertaken to relate each of these outcomes to graft characteristics. Proportional hazards regression was used for assessment of graft survival and time to first rejection. For visual acuity and astigmatism, multiple linear regression was used following logarithmic and square root transformation respectively. Worked examples of use of the coefficients from these models for individual cases are given elsewhere.6 From the multitude of potential risk factors, those exercising statistically significant (p<0.01) influence over any of the outcomes were identified by a rational selection procedure. Estimates and 99% confidence intervals (CI) of the scale of such influence are presented from statistical models incorporating all such factors.
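To illustrate the form of these models, a minimal sketch follows; this is not the software actually used in the study, and the lifelines and statsmodels libraries together with all column names (time_to_failure, acuity_12m, and so on) are assumptions made purely for the example.

    # Illustrative sketch only: column names are hypothetical, not taken from the CTFS dataset.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from lifelines import CoxPHFitter

    df = pd.read_csv("ctfs_grafts.csv")  # hypothetical file, one row per registered graft

    # Proportional hazards regression for graft survival
    # (an analogous model is fitted for time to first rejection).
    cox = CoxPHFitter()
    cox.fit(df[["time_to_failure", "failed", "recipient_age", "regraft", "deep_vascularisation"]],
            duration_col="time_to_failure", event_col="failed")
    print(cox.summary)  # relative risks are exp(coefficients)

    # Multiple linear regression for outcomes in functioning grafts:
    # logarithmic transformation for visual acuity, square root for astigmatism.
    functioning = df[df["failed"] == 0]
    X = sm.add_constant(functioning[["recipient_age", "graft_size", "preop_acuity"]])
    acuity_model = sm.OLS(np.log(functioning["acuity_12m"]), X).fit()
    astigmatism_model = sm.OLS(np.sqrt(functioning["astigmatism_12m"]), X).fit()

Predictions for individual cases are obtained by combining the fitted coefficients with a recipient's covariate values and back-transforming, as in the worked examples cited above.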
Results
RESPONSE RATES
Of 4564 grafts recorded by UKTSSA while CTFS was ongoing, 3433 (75%) were registered with CTFS by 381 surgeons returning transplant record forms. Follow up information was not anticipated for 132 grafts, and was provided for 2785 recipients (84% of those anticipated), including 2777 (84%) with sufficient information to be included in analyses of graft survival. Three month forms were returned for 2633 (80%) registered grafts (including 171 recipients reported to have died or been lost to follow up), and 1645 twelve month forms were received (55% of those not known to have failed or to be otherwise unavailable).
RECIPIENT FACTORS
Factors exercising significant influence over at least one of the outcome measures were: surgeon experience; recipient age; number of previous grafts; reasons for grafting; diseases in the operated eye; preoperative visual acuity; diagnosis; stromal oedema; active inflammation; deep vascularisation; graft size; suture method; and vitreous surgery (Table 1).
GRAFT SURVIVAL
Overall graft survival was 88% at 12 months (Fig 1). Hazard of graft failure (the risk of a graft failing during a period given that it survived to the start of the period) was highest in the initial 75 day postoperative period, and decreased thereafter to between 1% and 2% per 75 day period.
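In life-table terms the hazard quoted here is simply the interval-specific failure rate; the following is an illustrative definition, not a statement of the exact estimator used:

    \[ h_j = \frac{d_j}{n_j} \]

where d_j is the number of grafts failing during the jth 75 day interval and n_j is the number still at risk (surviving and under follow up) at the start of that interval.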
Patients of consultants who reported in excess of 50 grafts during the course of the study had reduced risk of graft failure (Table 2). Grafts in recipients under 10 years of age, those for purely cosmetic and/or therapeutic reasons, and those using a mixture of continuous and interrupted sutures performed less well. Prognosis was worse for regrafts, and where there was deep vascularisation, active uveal inflammation, or stromal oedema. After allowance for these factors, the effect of diagnosis was not statistically significant.
TIME TO FIRST REJECTION
Rejection free survival was 86% at 12 months (Fig 2). Hazard of first rejection followed a similar pattern to that of graft failure, but remained high for 150 days before decreasing.
Risk of rejection reduced with increasing recipient age, but was increased for those with glaucoma or with diagnoses of secondary endothelial failure, inflammation, and other less common diagnoses (Table 3). Risk of rejection was increased for regrafts, particularly when two or more grafts had previously failed. Large grafts were also at increased risk of rejection.
For this outcome immunological factors were also considered. We found an adverse effect of matching at class II (HLA-DR) antigens (Fig 3). At each level of class I (HLA-A + HLA-B) mismatch, the class II mismatched group was less likely to reject: relative risk (RR) (99% CI) = 0.59 (0.35,1.00). We had little statistical power to detect an effect of matching at HLA class I antigens owing to the preference given in the tissue allocation scheme to minimising class II mismatches. The estimated detriment of class I mismatching versus no mismatch was of similar scale to the benefit of class II mismatching, but with wider confidence limits: relative risk (99% CI) = 1.50 (0.51,4.46).
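For readers wishing to relate these figures to the underlying model, the relative risk is the exponential of the fitted coefficient from the proportional hazards regression, and the 99% confidence interval uses the normal critical value 2.576; the coefficient and standard error below are illustrative values chosen to be consistent with the quoted result, not figures taken from our output.

    \[ \mathrm{RR} = e^{\hat\beta}, \qquad 99\%\ \mathrm{CI} = e^{\hat\beta \pm 2.576\,\mathrm{SE}(\hat\beta)} \]

With a coefficient of about -0.53 and standard error of about 0.205, RR = e^{-0.53} ≈ 0.59, with interval from e^{-0.53 - 2.576 × 0.205} ≈ 0.35 to e^{-0.53 + 2.576 × 0.205} ≈ 1.00.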
VISUAL ACUITY
Distributions of preoperative, 3 month, and 12 month visual acuity in functioning grafts performed for visual reasons (that is, excluding those for purely cosmetic or therapeutic reasons) showed improvement at each stage (Fig 4). Whereas 15% of grafts performed for visual reasons had preoperative corrected visual acuity of 6/24 or better, 58% of grafts functioning at 3 months and 70% of those surviving at 1 year reached this mark.
Only 28% of the variability at each follow up could be explained by statistical models incorporating all of the factors under consideration. Visual acuity was best for young adults, and poor following regrafts (Table 4). Not surprisingly those with better preoperative visual acuity fared better, whereas those with glaucoma and other diseases fared less well. Diagnoses of keratoconus, stromal dystrophy, and primary endothelial failure carried the best prognosis. Larger grafts performed well, but combined vitreous surgery reduced acuity, particularly in the short term.
ASTIGMATISM
Distributions of preoperative, 3 month, and 12 month dioptres cylinder (DC) in functioning grafts, excluding those performed for purely cosmetic or therapeutic reasons, showed little pattern (Fig 5). Few eyes were refracted preoperatively, but of those that were 77% had less than 4 DC of astigmatism; 51% of eyes refracted at 3 months, and 57% of those refracted at 1 year reached this mark.
Only 5% of the variability in readings at each follow up could be explained by statistical models incorporating all of the factors under consideration. No factor appeared to be prognostic for 3 month astigmatism. At 12 months, only the 11 recipients aged under 10 years appeared to have significantly reduced astigmatism. For example, where an 80-year-old recipient may have expected 6 DC, a young child with similar graft characteristics would have expected 2.5 DC, with 99% CI from 1.1 DC to 4.5 DC.
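Because the model is fitted on the square root scale, predictions are back-transformed by squaring. As an illustration consistent with the figures above (the age effect of 0.87 on the square root scale is a hypothetical value chosen to reproduce the quoted prediction):

    \[ \sqrt{6}\ \mathrm{DC} \approx 2.45, \qquad (2.45 - 0.87)^2 \approx 2.5\ \mathrm{DC} \]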
Discussion
The Corneal Transplant Follow up Study constituted one of the largest studies of corneal graft outcome undertaken, involving nearly 400 surgeons on a national basis. Prognosis for graft outcome is by nature multifactorial, and the size of this study allowed detailed statistical modelling of multiple factors. All statistical models make assumptions concerning the data, and those presented here have been validated by methods discussed previously.2,3,6
Graft survival was 88% at 1 year, which is comparable with the 91% reported by the Australian Corneal Graft Registry.7 The hazard rate after 6 months for both graft failure and first rejection experience, which may ultimately lead to graft failure, was low. This has important implications for future research. Although different risk factors may come into play over longer term follow up, most ‘events’ will occur early in the postoperative period, reducing the need for costly long term follow up. However, there is the possibility of a slight increase in both hazard rates at around the 1 year mark. This may represent increased hazard at the time of suture removal.
Our most controversial finding was that grafts mismatched for HLA-DR were less likely to suffer rejection than those with no HLA-DR mismatch. This may be explained theoretically by the ‘docking hypothesis’.5 The largest clinical study of class II matching, the Antigen Matching Study8 of the Collaborative Corneal Transplantation Studies Research Group, found little effect. They reported a relative risk (95% CI) for graft reaction of 1.10 (0.82,1.48), and suggested that the lack of effect may be due to aggressive immunosuppression. Following the report of our interim findings, we conducted a structured overview of published reports.4 This concluded that even in high risk cases HLA-DR mismatching is at worst associated with minimal harm (upper 95% confidence limit 1.25). We recommend that corneal tissue allocation procedures not be based on knowledge of HLA-DR.
We previously reported2 a relative risk (95% CI) of graft failure of 2.36 (1.21,4.59) for 298 cases with primary endothelial failure versus 457 cases with keratoconus. This result was not apparent in the finalised data with more cases and further adjustment for clinical factors: relative risk (95% CI) was 1.34 (0.69,2.62).
Risk of graft rejection reduced with increasing recipient age. The matching process for most corneas in this series involved some age matching of donor to recipient. While it is possible that the age effect is, therefore, due to age associated changes in donor tissue leading to loss of immunogenicity, the multifactorial methods implicated recipient cellular immunity. We have been unable to identify previous reports of such an effect, although this may be due to others’ use of inappropriate statistical methods. Age is associated with factors such as diagnosis, previous graft history, and glaucoma, so that its effects may be missed by unifactorial analyses. Indeed, reanalysis of our data with recipient age considered in isolation from other factors (results not given) misses the age effect.
We speculatively propose the following biologically plausible mechanism. In normal individuals ageing is associated with alteration of T cell subtypes and changing responsiveness to cytokines, accompanied by thymic involution. The bulk of the thymic tissue begins to decline at birth and continues to do so at 3% per annum until middle age, when the rate decelerates to 1% per annum.9,10 Unlike B cell immunity, T cell immunity relies increasingly on clones of committed memory T lymphocytes that may react to immunogenic epitopes with little or no cross reactivity with histocompatibility alloantigens.11-14 The lack of capacity to expand truly novel clones may thus lead to a reduced potential to reject.
Factors influencing visual acuity at 12 months were already apparent in the analysis of short term visual acuity. This was true despite the continuing improvement generally in visual acuity over the intervening period. This suggests that short term follow up may be adequate for research purposes. However, at neither time point was more than 30% of the variability in visual acuity explained. In the case of astigmatism, even less of the variability was explained, despite collecting detailed information on operative methods in addition to preoperative clinical condition.
Only one factor under the control of surgeons was beneficial for one outcome measure but detrimental for another. Large grafts were at increased risk of rejection, but those which survived achieved better visual acuity at both 3 and 12 months. In common with others using multifactorial analyses,15 but not those analysing graft size in isolation,7,16 we previously reported reduced risk of graft failure for larger grafts. This effect was not statistically significant in the finalised data.
Far more is now known concerning corneal transplantation in the United Kingdom than was known at the outset of CTFS. Corneal Transplant Service eye banks store donor tissue in organ culture, allowing time for screening which negates the effect of donor factors on graft outcome.17 This has led to a dramatic increase in the number of registered grafts. However, the size of the donor pool remains static, suggesting that the corneal transplantation community can ill afford to be complacent.18
Further studies should seek to identify reasons for the higher graft survival rates reported by the most experienced surgeons. Differences may in part be due to experience in postoperative care, particularly in the identification and management of astigmatism and other complications.
Acknowledgments
A list of collaborating surgeons is given in the appendix of reference 1.