Corneal graft rejection presents clinically and in experimental models as opacification and is considered to be the result of endothelial cell dysfunction or loss. However, recovery from opacification can occur suggesting either (a) that new endothelial cells can regenerate if the original cells were lost, or (b) that sufficient numbers of original cells can regain function if the opacification was due to temporary dysfunction. In this perspective, previous experimental studies of allograft rejection plus some new data are reviewed to support the latter mechanism.
- corneal graft rejection
- mouse model
Experimental models of corneal grafts have proved invaluable in advancing understanding of the processes involved in graft rejection (for review see Larkin1). Early studies utilised rabbits and, more recently, sheep2 but the availability of a large range of reagents for rodents has permitted more mechanistic studies on rats and mice. Orthotopic keratoplasty in rats was described by Williams and Coster.3 Heterotopic keratoplasty in the mouse was first described in the early 1980s4–6 while the first reports of orthotopic murine corneal grafts were published a decade later.7,8 Orthotopic murine corneal grafts were made possible by advances in surgical instrumentation and the manufacture of fine suture materials. Since this time there has been an increasingly large volume of papers investigating the immunology of murine corneal graft rejection.
Since investigation of the mechanism and treatment of allograft rejection is the central purpose of much of the current research, it is relevant to define precisely what is meant by corneal graft rejection. Clinically and experimentally, rejection of corneal grafts is recognised as opacification of the cornea. It is thought that various components can contribute to opacification or reduced clarity of the cornea including cellular infiltration, new vessel ingrowth, thickening and irregularity of the cornea, and oedema. Accordingly, researchers using the rat and mouse orthotopic keratoplasty models have developed grading systems for corneal graft rejection which incorporate each of these features—cellular infiltration, oedema, opacity, and new vessel ingrowth—and produce an aggregate score. An arbitrary score level is then taken as clinical evidence of rejection. However, it has been shown that the total aggregate score correlates well with the single grading of the level of opacity as a measure of clinical graft rejection. As a result, opacity level is commonly used as the standard means for evaluating rejection and most researchers adhere to a common grading system similar to that described by Sonoda and Streilein9 or She et al.7 By convention, a score of 2 is regarded as clinical evidence of rejection. Some examples of clinical grading are shown in Figure 1a and b.
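The grading logic described above can be sketched in code. This is a hypothetical illustration only: the 0–4 scale per feature and the function names are assumptions for the example, since the cited grading systems differ in detail; the one fixed convention taken from the text is that an opacity score of 2 or more counts as clinical rejection.

```python
# Hypothetical sketch of a clinical grading scheme for rodent corneal
# grafts: each feature is graded (assumed 0-4 here) and summed into an
# aggregate score; opacity alone is compared against the conventional
# rejection threshold of 2.

def aggregate_score(opacity, oedema, infiltration, vessels):
    """Sum the individual clinical feature grades into one score."""
    return opacity + oedema + infiltration + vessels

def is_rejected_by_opacity(opacity, threshold=2):
    """By convention, opacity >= 2 alone is taken as clinical rejection."""
    return opacity >= threshold

# Example: a graft with marked opacity, mild oedema and infiltration.
graft = {"opacity": 3, "oedema": 1, "infiltration": 1, "vessels": 0}
print(aggregate_score(**graft))                   # 5
print(is_rejected_by_opacity(graft["opacity"]))   # True
```

As the text notes, in practice the aggregate score and the single opacity grade track each other closely, which is why opacity alone is commonly used.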
Graft rejection in the mouse and rat models can thus be compared by simply grading the opacity level alone. Several groups have reported differing graft rejection rates using this criterion. The cornea is considered to be an immunologically privileged tissue in an immunologically privileged site, and this has been cited as the main reason for the low rates of corneal graft rejection in humans. However, although variably delayed compared to skin grafts, corneal grafts in rats appear to be rejected in 95–100% of cases in almost all studies, with the time of onset of rejection varying from 8 to 18 days.10–12 No recovery of corneal clarity has been reported, implying that in these studies rejection, as defined by corneal opacity, was irreversible. In only one rat study did corneal transparency improve, in 43% of animals, after an initial period in which 100% of animals showed a high opacity level.3 Corneal grafts in mice have a better recovery rate. In most studies, which mostly used fully MHC mismatched allografts, 80–100% of mice have corneal opacity levels greater than 2 (indicating “rejection”) between days 10–25,7–9,13–19 but many of these “rejected” corneas recover clarity, such that by days 50–60 only 45–55% of corneas remain opaque at level ≥2.9,15,17,18 In one study, this early peak in “rejection” was not observed and 45% of the grafts remained clear from day 25 to the end of the experiment (day 50)20; in two further studies using different criteria of rejection, 100% of grafts remained “rejected” from day 18 (mean survival time).14,19 Interestingly, when the donor–host MHC mismatch was reversed,16 the percentage of grafts which cleared after the initial rejection failed to reach more than 20% at 8 weeks,16 as previously found by He et al.8
It thus appears that in both rats and mice almost all animals pass through an early phase (8–21 days post graft) in which the corneal opacity grade reaches a level greater than clinical grade 2 (that is, the donor cornea is clinically “rejected”) but that, in some strain combinations, there is potential for recovery of corneal clarity at a later stage, generally taken as 8 weeks post graft. This biphasic response greatly depends on the strain combination and on the host background.
Other factors also play a part in the success of rodent corneal allografts. It is generally recognised that the surgical procedure is technically demanding and that the trauma of the procedure may affect the outcome. Indeed, in many studies it is normal practice to exclude individual animals as “technical failures,” such as animals that show signs of cataract or wound leak immediately post graft. However, this is the extreme end of a traumatic procedure, and even minor damage can have a major influence on the results of grafting. This has been clearly demonstrated by Yamagami et al,21 who compared the outcome of grafting in animals in which anterior synechiae were intentionally induced as part of the graft procedure with that in animals which received grafts but had no synechiae. The percentage of clear grafts (score <2) in the former group fell to 20% at 8 weeks, compared with the previously reported 50% achieved in the latter. Zhang et al22 have developed an “underwater” technique, which minimises damage to the donor endothelium when preparing the donor graft. In spite of this, they observed 100% opaque grafts (score ≥2) when transplanting fully mismatched corneas, using the previously described grading system.9
We have performed a series of murine corneal transplantations in the fully mismatched pairing of C57BL/10 (H-2b) donor to BALB/c (H-2d) recipient and compared the outcome of the procedure with that of a similar series of syngeneic (BALB/c to BALB/c) grafts. Our intention was to evaluate the importance of surgical trauma to the eventual success of the graft, as assessed by corneal opacity grading.9 We did this by using different methods of suturing the donor cornea (interrupted v running sutures) and different types of needle/suture material.
Since the integrity of the donor endothelium is considered important to the initial survival of the graft, and since it has been reported that the murine donor cornea is particularly susceptible to damage owing to its small size and the risk of folding,22 we assessed our technique for harvesting the donor cornea using low magnification confocal microscopy of the endothelium (Fig 1c, d, e). Donor corneas were removed using an atraumatic technique involving instillation of Microvisc (high molecular weight hyaluronic acid) through a paracentesis incision into the anterior chamber via a 30 gauge cannula. The donor cornea was then removed with curved Vannas scissors and fixed in buffered 4% paraformaldehyde (for 2 hours). The tissue was then rehydrated and the endothelium stained with phalloidin or phalloidin-FITC for actin. It can be seen that using this method the donor endothelium was minimally disrupted and presented as an intact uniform monolayer of cells before grafting.
Method of evaluating clinical graft rejection and analysing data
Conventionally, graft outcome data are presented as “survival” curves in which an arbitrary end point for clinical rejection is set and the percentage of grafts remaining clear is assessed at various time points. We present our data in this manner but have omitted the “retrospective” component used, for instance, by Sonoda and Streilein,9 Joo et al,15 and Hašková et al17,18 (Fig 2). We compared the clinical value of grading rejection using an aggregate score of all clinical features against a single grading of opacity alone. Rejection was taken as a score greater than 5 when the aggregate grading17 was used and greater than 2 when opacity9 was used. We observed that the two survival curves coincided almost exactly. This confirms previous observations from other laboratories.9,17,18
Survival curves are valuable in presenting data on graft rejection rates and immediately reveal differences between variously treated groups. Survival curves axiomatically imply that grafts which do not survive cannot recover. However, as indicated above, several studies have indicated that opacification of grafts may be transient and that a graft considered “rejected” (that is, opacity ≥2) may become clear at a later time. Presentation of the raw data as opacity grade therefore permits analysis of the transient opacification (Fig 3). In addition, by presenting the range or standard deviation for the group, the percentage of “rejected” grafts (score ≥2) at any time point can be seen.
However, data presented as opacity curves are not strictly compatible with the same data presented as survival curves. Specifically, if a graft considered “rejected” at one time point can become clear at a later time point, presentation of the data as “survival curves” becomes untenable: a previously failed, “rejected” graft has “survived” at a later time point (Fig 4). We recommend therefore that the data be presented as opacity curves (Fig 3) or that survival rate curves, if used (Fig 4), be renamed transparency curves.
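The distinction between the two presentations can be made concrete with a short sketch using invented opacity scores (the data below are hypothetical, purely for illustration). A conventional “survival” curve, once a graft crosses the rejection threshold, counts it as failed at every later time point, whereas a transparency curve simply reports the fraction of grafts that are clear at each time point, so transient opacification with later recovery remains visible:

```python
# Hypothetical opacity scores for four grafts at successive time points.
# Rows = grafts, columns = observation days. Score >= 2 means "rejected".
scores = [
    [0, 1, 1, 0],  # stays clear throughout
    [1, 3, 1, 0],  # transiently opaque, then recovers
    [1, 2, 3, 3],  # irreversibly opaque
    [0, 3, 3, 2],  # irreversibly opaque
]

def transparency_curve(scores, threshold=2):
    """Fraction of grafts clear (opacity < threshold) at each time point."""
    n = len(scores)
    return [sum(g[t] < threshold for g in scores) / n
            for t in range(len(scores[0]))]

def survival_curve(scores, threshold=2):
    """Fraction never yet rejected: once opacity >= threshold, a graft
    counts as failed at all later time points (no recovery allowed)."""
    n = len(scores)
    failed = [False] * n
    curve = []
    for t in range(len(scores[0])):
        for i, g in enumerate(scores):
            if g[t] >= threshold:
                failed[i] = True
        curve.append(failed.count(False) / n)
    return curve

print(transparency_curve(scores))  # [1.0, 0.25, 0.5, 0.5] -- recovery visible
print(survival_curve(scores))      # [1.0, 0.25, 0.25, 0.25] -- recovery hidden
```

In this toy example the second graft recovers clarity after day 1; the transparency curve rises again to reflect this, while the survival curve stays flat and so understates the final proportion of clear grafts.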
Allogeneic versus syngeneic grafts
The effect of these different methods of data presentation can be readily demonstrated. Using the conventional “survival curves” and opacity score curves, it is clear that different information is obtained from the same experimental set (Fig 5A and B). Using “survival curves” (Fig 5A) it can be seen that at 30 days the survival rate for syngeneic grafts is 100% and for allogeneic varies between 10% and 20% depending on which data presentation is used (Figs 4, 5A). At 60 days the figure rises to over 50% (Fig 4). Using the opacity score v time (Fig 5B) it can be seen that even in the syngeneic group there was some degree of opacity but that it never reached a level sufficient to represent rejection as defined by opacity ≥2.
The effect of surgical technique on graft rejection (defined as opacity ≥2) is shown in Figure 6. There was no apparent difference in rejection rates when using the “survival curves” to analyse the data (Fig 6A). Using opacity grading, a majority of grafts with interrupted sutures went through an early transient period of grade ≥2, which declined after a few days in the majority of animals (Fig 6B). This appeared to be related to a combination of increased surface suture knots and the need to remove the interrupted sutures on day 7. This early transient “rejection” was not seen with the running suture, which was not removed. However, the eventual results for both the running and interrupted suture techniques followed the same time course, with high levels of rejection (100%) by day 33 for interrupted and day 45 for running sutures.
The effect of different types of suture material in a slightly smaller graft bed is shown in Figure 7. It can be seen that with the Sharpoint 11/0 nylon suture and needle (50 μm, non-spatulate) and a 1.5 mm diameter graft bed (2 mm graft) the opacity score was not significantly different (p>0.05) from that with the Mersilene 11/0 polyester suture and spatulate needle (150 μm). However, clinically the grafts performed with the nylon suture appeared much less inflamed. We compared recent results with previous data obtained using the Ethilon 11/0 suture (Ethicon, UK), which showed less inflammation with interrupted sutures than the Mersilene 11/0 suture (using the same spatulate needle with both suture materials) (data not shown).
As indicated above, clinical graft rejection in humans includes cells in the anterior chamber, changes in the clarity of the cornea (sometimes in the form of various rejection lines), corneal oedema, and ingrowth of new vessels into the donor cornea.1 Graft rejection is considered irreversible when the cornea remains opaque despite treatment. Corneal opacity under these circumstances is usually attributed to oedema or water retention by the stroma and epithelium and, on the basis that the endothelium is the major cell type involved in deturgescence of the cornea, it is assumed that irreversible graft rejection involves permanent loss of the donor endothelium without recovery/replacement from host endothelium.
These concepts have been applied to the experimental model. Initially, evaluation of graft rejection took account of all the above signs,3 but later studies showed that corneal opacity was sufficient to assess rejection9 and this observation is confirmed here. Thus, opacity of the cornea and, by implication, loss of the corneal endothelium, is conventionally accepted as the clinical benchmark of murine corneal graft rejection.
From the above experimental observations several inferences can be drawn. However, a brief comment is important regarding the terms of reference. In particular, we urge greater caution with the use of such terms as rejection, rejection rate, rejection reaction, success, acceptance, failure, survival, and outcome when applied to corneal graft rejection. It is clear that the alloimmune response directed against the cornea is not a sudden and “all or none” event and that the cornea can go through a period of immunological attack during which it may experience impaired function and become temporarily non-transparent (opaque). This has been shown not only in the mouse but also in the rat.9,15,17,18,23–25 Nevertheless, this period may pass and the cornea recovers clarity (a situation seen in both human and experimental models) or it may not pass and the cornea goes into irreversible opacification. Thus the eventual “success” or “failure” of a graft can only be judged by a combination of the level of opacity of the graft and a specific duration, the latter determined by the clinical or experimental conditions under study. Pathologically, rejection of the cornea is easier to define—that is, by total loss of the corneal endothelium, but this criterion is not available to us clinically under most situations and experimentally might be extremely labour intensive if applied generally to model systems.
The inferences we draw therefore from these experimental observations are described here. Firstly, corneal allografts in the mouse have a delayed but eventual high rate of rejection (see Fig 6A) suggesting that the privileged status of the cornea is not as absolute as generally considered.26 This coincides with similar observations in human corneal allografting in which late (>5 years) survival rates are lower than expected.27
Secondly, murine (see Fig 6B) and rat3 corneal allograft rejection, as assessed by corneal opacification, appears to go through a phase of transient rejection when interrupted sutures are used but not when running sutures are used. A minor but less impressive increase in corneal opacity (below levels of clinical rejection) occurs in syngeneic corneas around the same time (see Fig 5B). This early transient peak in corneal opacity follows closely the time of removal of interrupted sutures and is likely to represent non-specific inflammation (that is, the innate immune response) rather than immunological rejection, particularly since it is observed to a degree in syngeneic grafts. This suggests that although an innate immune response precedes and is probably required to trigger adaptive immunological rejection, the difference in the induced innate immune response between eyes treated with interrupted or running sutures is not sufficient to affect the eventual outcome of graft transparency.
Thirdly, the precise interpretation of the early transient opacification is unclear. If it can be assumed that the donor endothelium has been lost at this time and the corneal opacity is caused by water retention in the stroma/epithelium as for human corneas, recovery of corneal clarity implies that the endothelium has recovered and that the denuded donor Descemet's membrane has been repopulated by (presumably) host endothelium. Previous studies in rabbit and rat in which the endothelium of the cornea has been destroyed by, for example, full thickness cryotherapy or other types of injury,28–30 have shown that the host endothelium has the potential to recover. Even the human corneal endothelium has been shown to express markers of cell proliferation after injury.31 However, these experiments have been performed in the ungrafted, intact animal. A recent study in the rabbit has shown that the transplanted, endothelial cell denuded, syngeneic donor cornea can be repopulated from healthy host corneal endothelium, once again demonstrating the capacity of the rabbit endothelium to recover in an autologous system.32 No similar studies have been performed in the mouse or rat cornea after allografting, and there is no information on whether host corneal endothelium can replace donor corneal endothelium after it has been destroyed during allograft rejection. Most recently, a study to address this question was performed in the green fluorescent protein-transgenic (GFP-Tg) mouse model in which GFP is expressed in all cells except red cells and hair follicles.33 Graft rejection was assessed at 8 weeks and it was found that clear allografts retained donor endothelium while rejected allografts had no endothelium, as expected. There was apparently no attempt by the host endothelium to grow onto the posterior surface of denuded rejected donor grafts despite its known capacity for proliferation after injury (see above).
The alternative explanation for the early phase of rejection (see Fig 2) is that the donor endothelium can go through a period of dysfunction in which its ability to deturgesce the cornea is impaired, possibly by alloreactive T cells and macrophages, but that it can recover its normal function if these inflammatory cell responses are limited. In the GFP-Tg mouse experiment described above,33 no comment was made on whether any of the clear allografts had gone through a period of opacification. This is important since, assuming that at least some of these grafts had become transiently opaque, their eventual clearing indicates that the allograft response has been checked and that the corneal endothelium or another component of the ocular microenvironment has the potential to regulate the immunological response to the allograft. It is likely that this transient period of endothelial dysfunction is mediated either by allospecific cytotoxic T cell attack or by non-specific, innate immune cell attack. In both cases, these responses may be limited, perhaps through Fas-FasL mediated mechanisms in the case of allospecific attack, and through removal of the stimulus in the case of innate attack. Further investigation of this aspect of graft rejection is important since it parallels the “rejection episodes” seen in human corneal grafts. Interestingly, these episodes in human corneal grafts are also frequently induced by non-specific injury such as suture removal or minor infections.27
There are several conclusions that can be drawn from this review. Firstly, corneal opacity or oedema is frequently a measure of endothelial dysfunction and, when it occurs following allografting, it suggests that the endothelium is damaged. It does not necessarily imply that the endothelium has been lost and that graft rejection has occurred—that is, the corneal endothelium may be under immunological (innate or adaptive) attack but is not necessarily lost. However, corneal opacity is a good, and probably the most reliable, clinical parameter of the alloimmune inflammatory response in the experimental model.
Secondly, the transient opacification of some allografts with later clearing indicates that opacification is reversible and that the endothelium can recover. At present, there is no evidence which would favour one or other of the two possible explanations for this endothelial cell recovery—that is, that the donor endothelium has been repopulated by host endothelium or that a transient alloimmune response directed against the endothelium has been held in check. Further research on this important question is necessary.
Thirdly, the absence of a transient “rejection episode,” or opacification of the cornea, in mice with running sutures, and its presence when interrupted sutures were removed, indicates that minor trauma (for example, suture removal) and the induction of an innate immune response can trigger this event.
Lastly, the eventual high rate of rejection in the mouse (and rat) indicates that the corneal graft is not absolutely protected from rejection by immune privilege; rather, the protection afforded by immunological privilege is relative and is quantifiable as the delay in time to rejection compared with other tissues such as skin.
Supported by grants from the Royal College of Surgeons of Edinburgh (UK) and the University of Aberdeen Development Trust (UK), NATO Linkage grant CRG.LG972853 (UK), grant MSM 111100005 (Czech Republic), and grant IGA MZ NI 6019–3/2000 (Czech Republic).