Steroid Withdrawal After Pediatric Liver Transplantation: A Long-Term Follow-Up Study in 109 Recipients. Transplantation 2003; 75: 1664. H. Vo Thi Diem, E. M. Sokal, M. Janssen, J. B. Otte, and R. Reding

Since the discovery of cortisol by Kendall and Wintersteiner in 1937, cortisone has been indicated for the treatment of more than 200 medical conditions. Corticosteroids paved the way for successful medical immunosuppression and proved the reversibility of rejection.

Immunosuppressive medication in transplantation has been and continues to be constantly subject to change. No doubt, protocols will be continuously modified in the future. Because achieving tolerance or even anergy is still unrealistic, and long-term survival is fraught with immunosuppressant-specific side effects, innovative researchers have always striven to reduce or withdraw immunosuppression.

Three questions regarding withdrawal regularly arise:

1. Which immunosuppressant can be withdrawn?
2. Which patient group will benefit most?
3. When is the time right for withdrawal?

Withdrawal of immunosuppression can be implemented in two ways: either steroid withdrawal (SW) or substitution of calcineurin inhibitors (CI) (first-line drugs: cyclosporine A [CsA], tacrolimus [Tac]) with second-line immunosuppressants (e.g., azathioprine [AZA], mycophenolate mofetil [MMF], sirolimus [rapamycin]).

The potential benefits of SW are directly related to steroid side effects, most importantly growth retardation and osteoporosis. Corticosteroids may be regarded as the weak link of an old-fashioned immunosuppressive regimen. They are associated with a wide variety of adverse effects, and they may be successfully withdrawn without replacement therapy in more than 80% of cases (1). The risk (i.e., acute rejection) seems to be inversely related to the length of the interval following transplantation and will occur in approximately 10% to 30% of cases (1).
Benefits may also be linked to the time of withdrawal, although evidence is scarce in this matter. Moreover, in rodent models, tolerance induction or blockade of the costimulatory pathway may theoretically favor a steroid-free immunosuppression (2). So only the fittest immunosuppressants will survive! But although we may have given a plausible answer here, did we ask the right question?

If we are willing to accept a marginal increase in the rate of late rejection and graft failure, some evidence clearly supports CI withdrawal with substitution of second-line immunosuppressants. In recent years, we have witnessed a resurrection of dual therapy of antimetabolites combined with steroids to avoid the far more severe side effects of CI. Almost all withdrawal (substitution) studies showed a significant reduction of adverse effects. In a prospective randomized study in patients receiving liver transplants, early withdrawal (6 months posttransplant) of CI and replacement by MMF effected a substantial decrease in creatinine levels (P=0.003) and improved control of hypertension (3). In an earlier study in renal transplantation, conversion to AZA and prednisolone (vs. CsA monotherapy) 3 months after transplantation significantly improved creatinine levels (P<0.001), reduced costs (P<0.005), and increased the estimated 5-year survival from 78% to 87% (4).

Patients can be easily divided into pediatric and adult groups, but this may be too simplistic. Other, more specific criteria such as underlying disease, donor-organ quality, and the type of transplant are likely to exert a more crucial influence on changes in the immunosuppressive protocol than will the age group of the patient. To minimize risks, attempts have been made to predict patients' suitability for withdrawal by analyzing donor-specific T-cell reactivity.
Furthermore, there is a growing consensus among transplant physicians that optimal immunosuppression will require tailoring with respect to individual drug-related toxicity, donor-organ characteristics, and recipient-related risk factors.

Because immunosuppressant side effects will invariably occur, timing is important. Two years ago, in the Analyses & Commentaries section of Transplantation, P. F. Halloran (5) mentioned an arbitrary cut-off point of immunologic adaptation between the 6th and 12th month following transplantation. There is considerable evidence that the pre- and postadaptation periods are two distinct intervals. In the preadaptation interval, the focus should primarily be on a low rejection rate and graft survival, which requires highly efficacious immunosuppression, whereas in the second interval, postadaptation, the focus should be on reducing potential side effects and increasing quality of life.

In this issue of Transplantation, Vo Thi Diem et al. (6) present an interesting retrospective, single-center experience with 109 pediatric patients following SW after liver transplantation. Two points warrant further scrutiny:

1. Selection bias. In spite of the large group of 500 consecutive pediatric liver-transplant patients, only 169 (34%) were initially included case-by-case, on the basis of clinical, biochemical, and histologic evolution. This cohort was then reduced further by the loss of another 60 (36%) patients (e.g., because of insufficient follow-up time or data). Finally, roughly 20% of the total patient cohort was analyzed. Unfortunately, the authors were unable to identify a subpopulation that would substantially benefit from SW, an occasional and sometimes valuable by-product of larger retrospective studies.

2. Homogeneity. As in most retrospective studies, homogeneity is almost impossible to achieve.
Following SW, the authors observed only marginal improvements in fasting cholesterol, growth rate, and reduction of antihypertensive medication. It was pointed out that all effects could be at least partially attributed to declining CI trough levels or Tac- versus CsA-based immunosuppression. Furthermore, uniformity of the study population is upset by different posttransplant intervals and immunosuppressive protocols and the inclusion of living-related donation.

Nevertheless, it is a carefully documented, retrospective, single-center study with long-term follow-up and, as such, has its merits. Because in this setting risks were almost negligible, the marginal benefits should prompt further prospectively controlled SW studies with more sensitive and specific parameters.

But what can be done to advance our knowledge of immunosuppression withdrawal? As a rule of thumb, both efficacy and adverse effects of immunosuppressants, and benefits and risks of withdrawal, counterbalance one another. Recurrent acute rejection and