JBJS

PGE2 Ameliorates Aging-Aggravated Rotator Cuff Muscle Atrophy

J Bone Joint Surg Am. 2025 Jun 6. doi: 10.2106/JBJS.24.00866. Online ahead of print.

ABSTRACT

BACKGROUND: The aging-related escalation of muscle degeneration impacts the structure and function of rotator cuff muscles, contributing to spontaneous and tear-induced muscle atrophy. This study investigated how prostaglandin E2 (PGE2), a regulator of muscle regeneration, influences muscular structure and mitochondrial function in aged mice by using SW033291 to inhibit PGE2 degradation, revealing potential therapeutic pathways for mitigating rotator cuff muscle deterioration.

METHODS: A total of 20 young (5 to 6-month-old) and 100 aged (18 to 20-month-old) female C57BL/6J mice were divided into 2 groups: the first group included young, aged, and aged+SW033291 subgroups and was used to study sarcopenia, and the second group consisted of tear, tear+repair, and tear+repair+SW033291 subgroups and was used to examine the outcomes following a rotator cuff tear (RCT). Tissue staining, muscle mass assessments, functional assays, and mitochondrial function tests were performed.

RESULTS: Rotator cuff muscle degeneration was observed in the setting of natural aging and in the setting of an RCT. These conditions together worsened muscle atrophy and fatty infiltration into the muscle, with the aged tear group demonstrating a decrease in muscle mass from a mean and standard deviation of 45.45 ± 4.04 mg to 25.18 ± 1.82 mg (p < 0.001) and a reduction in fiber cross-sectional area (CSA) from 1,697.3 ± 108.4 to 1,263.0 ± 56.8 μm² (p < 0.001). This was linked to increased 15-hydroxyprostaglandin dehydrogenase (15-PGDH) activity and a reduction in PGE2 levels in the aged tear group (from 2.897 ± 0.177 to 1.873 ± 0.179 ng/g muscle; p < 0.001). SW033291 treatment increased the level of PGE2, reversing muscle atrophy by mitigating mitochondrial dysfunction in both models, as demonstrated by a muscle mass of 33.50 ± 3.05 mg and a CSA of 1,423.6 ± 81.3 μm² in the presence of both conditions.

CONCLUSIONS: These findings support the hypothesis that elevated PGE2 levels can improve muscle health by reversing mitochondrial dysfunction, offering a strategy to combat sarcopenia and to enhance rotator cuff repair.

CLINICAL RELEVANCE: Large or massive RCTs are associated with muscle atrophy, a higher retear rate, and suboptimal surgical outcomes, especially in elderly patients. This study showed that the occurrence of rotator cuff muscle degeneration and muscular mitochondrial dysfunction in both the natural aging and RCT mouse models was mitigated by enhanced PGE2 levels. This finding demonstrates the efficacy of the application of a 15-PGDH inhibitor and suggests a possible new therapeutic approach.

PMID:40479501 | DOI:10.2106/JBJS.24.00866

Immobilization Time for Conservative Treatment of Distal Radial Fractures in Elderly Patients: A Randomized Controlled Trial

J Bone Joint Surg Am. 2025 Jun 5. doi: 10.2106/JBJS.24.01480. Online ahead of print.

ABSTRACT

BACKGROUND: The management of distal radial fractures (DRFs) in elderly patients remains controversial. Although conservative treatment with cast immobilization is widely accepted, the optimal duration for immobilization is unclear. This study aimed to compare pain control, functional outcomes, and complication rates between 4-week and 6-week immobilization periods in elderly patients treated nonoperatively for displaced DRFs.

METHODS: A single-center randomized controlled trial was conducted, including 150 patients who were ≥65 years of age and had displaced DRFs. Patients were randomized into 2 groups: 4-week immobilization and 6-week immobilization. Pain was assessed using a visual analog scale (VAS) at 10 days after removing the cast and then at 3, 6, and 12 months after injury. Functional outcomes were measured using the Patient-Rated Wrist Evaluation (PRWE) and QuickDASH (the abbreviated version of the Disabilities of the Arm, Shoulder and Hand questionnaire) at 3, 6, and 12 months. Radiographs were reviewed for malunion, and complications and range of motion were also evaluated.

RESULTS: In the 135 patients analyzed, no differences were observed in pain or functional outcomes between the 2 groups at any time point. VAS scores 10 days after the cast removal were similar (3.87 for the 4-week immobilization group and 4.00 for the 6-week group; p = 0.67), as were PRWE scores (14.18 for the 4-week group and 15.51 for the 6-week group; p = 0.686) and QuickDASH scores (15.46 for the 4-week group and 17.86 for the 6-week group; p = 0.449) after 1 year. The malunion rates were 29.9% in the 4-week group and 32.8% in the 6-week group (p = 0.85), and there were no significant differences in complications or range of motion between groups.

CONCLUSIONS: A 4-week immobilization period provided pain control, functional outcomes, and complication rates equivalent to those of a 6-week immobilization period in elderly patients with displaced DRFs treated nonoperatively. Therefore, a shorter immobilization period may be safely recommended for treating these fractures.

LEVEL OF EVIDENCE: Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence.

PMID:40472139 | DOI:10.2106/JBJS.24.01480

The Future Is Mobile: Pilot Validation Study of Apple Health Metrics in Orthopaedic Trauma

J Bone Joint Surg Am. 2025 Jun 4. doi: 10.2106/JBJS.24.00842. Online ahead of print.

ABSTRACT

BACKGROUND: Surgeons often lack objective data on patient functional outcomes, particularly as compared with the patient's baseline. The present study aimed to determine whether gait parameters recorded on Apple iPhones provided longitudinal mobility data that matched clinical expectations following lower-extremity fracture surgery. We hypothesized that iPhones would detect the mobility changes associated with injury and early recovery, correlate with patient-reported outcome measures, and differentiate patients with nonunion.

METHODS: This cross-sectional study included 107 adult patients with lower-extremity fractures who owned iPhones and had at least 6 months of follow-up. Participants shared Apple Health data and completed Patient-Reported Outcomes Measurement Information System (PROMIS) surveys. The primary outcome was the daily step count. Four other gait-related parameters were analyzed: walking asymmetry, double support, walking speed, and step length. Mixed-effects models compared mobility parameters at the pre-injury, immediate post-injury, and 6-month post-injury time points. Correlations between mobility parameters and PROMIS surveys were assessed. A mixed-effects model evaluated the relationship between step count recovery and surgery for nonunion.

RESULTS: There was a 93% reduction in daily step count from the pre-injury period to the immediate post-injury period (95% confidence interval [CI], -94% to -93%). Other gait parameters also showed increased impairment from pre-injury to post-injury. At 6 months, step count improved sixfold relative to the immediate post-injury period but remained 52% below baseline (95% CI, -55% to -49%). PROMIS Physical Function correlated moderately with step count (r = 0.42; 95% CI, 0.25 to 0.57) and weakly with other gait parameters. Patients with a known nonunion had a 55% slower recovery of step count than those without a nonunion (95% CI: 44% to 66%).
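
For intuition about how these relative changes fit together, here is a minimal worked example in Python; the 10,000-steps/day baseline is an assumed illustration, not a value from the study, and the published estimates come from mixed-effects models rather than simple ratios of raw means.

```python
# Illustrative arithmetic only; the baseline is assumed, not reported.
baseline = 10_000                      # hypothetical pre-injury daily step count
post_injury = baseline * (1 - 0.93)    # 93% reduction -> ~700 steps/day
six_months = baseline * (1 - 0.52)     # still 52% below baseline -> ~4,800 steps/day

improvement = six_months / post_injury
print(f"Post-injury: {post_injury:.0f} steps/day")
print(f"6 months:    {six_months:.0f} steps/day")
print(f"Improvement: {improvement:.1f}x")  # ~6.9x, consistent with the reported ~sixfold gain
```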

CONCLUSIONS: Apple Health mobility parameters captured changes in mobility following lower-extremity fracture and throughout the subsequent recovery period. These metrics distinguished between patients with and without nonunions, demonstrating their potential usefulness as objective, real-world functional outcome measures. These "digital biomarkers" may aid clinical decision-making and research and could be utilized for the early identification of patients at risk for poor outcomes.

LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40465739 | DOI:10.2106/JBJS.24.00842

Long-Term Mortality Associated with Periprosthetic Infection in Total Hip Arthroplasty: A Registry Study of 4,651 Revisions for Infection

J Bone Joint Surg Am. 2025 Jun 3. doi: 10.2106/JBJS.24.01629. Online ahead of print.

ABSTRACT

BACKGROUND: While the morbidity associated with revision total hip arthroplasty (THA) or periprosthetic infection (PJI) has been well characterized, less is known about the risk of mortality. With this study, we aimed to determine the long-term mortality associated with revision THA for PJI and associated risk factors.

METHODS: Data from the Australian Orthopaedic Association National Joint Replacement Registry (AOANJRR) were used to study mortality associated with THA procedures for osteoarthritis and subsequent revisions from September 1999 through December 2022. Kaplan-Meier estimates of survivorship and standardized mortality ratios (SMRs) based on Australian period life tables were used to summarize the overall survival following the primary and first revision THA. Risk factors associated with mortality were identified using Cox proportional hazards models, adjusted for age and gender.

RESULTS: There were 548,061 primary THA procedures for osteoarthritis; 4,651 first revision procedures for infection and 15,891 first revisions for reasons other than infection or fracture were recorded. At 5, 10, and 15 years, the cumulative mortality rates following revision for PJI were 14.5%, 34.7%, and 57.5%, respectively. Patients who underwent revision for PJI had higher-than-expected mortality rates compared with the general population, and the corresponding SMR (1.31; 95% confidence interval [CI]: 1.24 to 1.39) was greater than that for patients undergoing primary THA (0.81; 95% CI: 0.81 to 0.82) or aseptic revision (0.95; 95% CI: 0.92 to 0.99). A higher SMR following revision for PJI was observed in patients <65 years of age and in female patients, and it continued to increase beyond 15 years. There were no differences in mortality rates according to whether a major or minor revision was performed to manage PJI.
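
For readers unfamiliar with the metric, the SMR is the ratio of observed deaths in the cohort to the deaths expected from period life tables. A minimal Python sketch of that calculation follows, using Byar's approximation for the confidence interval; the counts are placeholders chosen to echo the reported SMR of 1.31, not AOANJRR data.

```python
import math

def smr_with_ci(observed: int, expected: float, z: float = 1.96):
    """Standardized mortality ratio with an approximate 95% CI.

    expected = deaths predicted by applying age- and sex-specific
    population (period life table) mortality rates to the cohort.
    Byar's approximation handles the Poisson-distributed observed count.
    """
    smr = observed / expected
    lower = observed * (1 - 1 / (9 * observed) - z / (3 * math.sqrt(observed))) ** 3
    upper = (observed + 1) * (1 - 1 / (9 * (observed + 1)) + z / (3 * math.sqrt(observed + 1))) ** 3
    return smr, lower / expected, upper / expected

# Placeholder counts (not registry data): 1,310 observed vs. 1,000 expected deaths
print(smr_with_ci(1310, 1000.0))  # ~ (1.31, 1.24, 1.38)
```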

CONCLUSIONS: Patients who underwent revision for infection had increased mortality rates compared with the general population and with those undergoing primary THA or aseptic revision. This excess risk persisted beyond 15 years, especially in younger patients.

LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40460198 | DOI:10.2106/JBJS.24.01629

Myocardial Infarction Prior to TKA Is Associated with Increased Risk of Medical and Surgical Complications in a Time-Dependent Manner

J Bone Joint Surg Am. 2025 Jun 2. doi: 10.2106/JBJS.24.01210. Online ahead of print.

ABSTRACT

BACKGROUND: There has been minimal literature evaluating how a prior myocardial infarction (MI) influences outcomes after total knee arthroplasty (TKA). Thus, the purpose of this study was to evaluate how the timing, type, and treatment of MI prior to TKA affect postoperative cardiac complications, general medical complications, and surgical complications.

METHODS: A retrospective comparative study was conducted using a large insurance database. Patients undergoing primary TKA for osteoarthritis were included. Patients who had experienced MI within 2 years before TKA were identified and were matched 1:4 with patients who had not had such an MI on the basis of demographic variables and comorbidities. Patients who had a prior MI were stratified into 4 groups based on the timing of the MI: 0 to <6 months, 6 to <12 months, 12 to <18 months, and 18 to 24 months before TKA. The rates of postoperative cardiac, general medical, and surgical complications were compared between groups. Subanalyses on the prior MI type, treatment, and location were performed.

RESULTS: Prior MI was associated with increased risks of postoperative MI (odds ratio [OR], 3.97 [95% confidence interval (CI), 3.20 to 4.93]), heart failure (OR, 1.45 [95% CI, 1.24 to 1.75]), and 90-day mortality (OR, 2.15 [95% CI, 1.41 to 3.28]). The risk of postoperative MI was highest for those with MI within 6 months before TKA (OR, 6.86 [95% CI, 5.34 to 8.82]). Type-1 MI, ST-elevation MI (STEMI), non-ST-elevation MI (NSTEMI), and anterior and inferior MIs were linked to elevated postoperative MI and/or mortality risks, with timing closer to surgery further amplifying the risk. Percutaneous coronary intervention within 6 months before TKA also increased postoperative risks. Type-2 MI within 6 months before TKA was associated with an increased risk of periprosthetic joint infection compared with controls (OR, 4.23 [95% CI, 1.67 to 10.67]).

CONCLUSIONS: Patients who had a prior MI, particularly within 6 months before TKA, had significantly elevated risks of postoperative MI, heart failure, and mortality. Outcomes varied by MI type, treatment, and location, with type-1 MIs and STEMIs increasing the postoperative mortality risk.

LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40455939 | DOI:10.2106/JBJS.24.01210

Evaluating Artificial Intelligence-Based Writing Assistance Among Published Orthopaedic Studies: Detection and Trends for Future Interpretation

J Bone Joint Surg Am. 2025 May 30. doi: 10.2106/JBJS.24.01462. Online ahead of print.

ABSTRACT

BACKGROUND: The integration of artificial intelligence (AI), particularly large language models (LLMs), into scientific writing has led to questions about its ethics, prevalence, and impact in orthopaedic literature. While tools have been developed to detect AI-generated content, the interpretation of AI detection percentages and their clinical relevance remain unclear. The aim of this study was to quantify AI involvement in published orthopaedic manuscripts and to establish a statistical threshold for interpreting AI detection percentages.

METHODS: To establish a baseline, 300 manuscripts published in the year 2000 were analyzed for AI-generated content with use of ZeroGPT. This was followed by an analysis of 3,374 consecutive orthopaedic manuscripts published after the release of ChatGPT. A 95% confidence interval was calculated in order to set a threshold for significant AI involvement. Manuscripts with AI detection percentages above this threshold (32.875%) were considered to have significant AI involvement in their content generation.

RESULTS: Empirical analysis of the 300 pre-AI-era manuscripts revealed a mean AI detection percentage (and standard deviation [SD]) of 10.84% ± 11.02%. Among the 3,374 post-AI-era manuscripts analyzed, 16.7% exceeded the AI detection threshold of 32.875% (2 SDs above the baseline for the pre-AI era), indicating significant AI involvement. No significant difference was found between primary manuscripts and review studies (percentage with significant AI involvement, 16.4% and 18.2%, respectively; p = 0.40). The rate of significant AI involvement varied across journals, ranging from 5.6% in The American Journal of Sports Medicine to 38.3% in The Journal of Bone & Joint Surgery (p < 0.001).
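
The threshold logic described above (pre-AI-era mean plus 2 SDs) can be reproduced directly from the rounded summary statistics in this abstract; the sketch below uses those rounded values, so it lands at about 32.88% rather than the published 32.875%, which was presumably computed from unrounded inputs, and the post-AI-era scores shown are hypothetical.

```python
# Rounded pre-AI-era baseline statistics reported in the abstract
baseline_mean = 10.84   # mean ZeroGPT AI-detection percentage, year-2000 manuscripts
baseline_sd = 11.02     # standard deviation

threshold = baseline_mean + 2 * baseline_sd
print(f"Threshold ~= {threshold:.2f}%")  # ~32.88%, vs. the published 32.875%

# Hypothetical ZeroGPT percentages for a handful of post-AI-era manuscripts
post_ai_scores = [4.2, 18.9, 35.1, 61.0, 12.3]
flagged = [score > threshold for score in post_ai_scores]
print(f"Share flagged: {sum(flagged) / len(flagged):.1%}")
```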

CONCLUSIONS: This study examined AI assistance in the writing of published orthopaedic manuscripts and provides the first evidence-based threshold for interpreting AI detection percentages. Our results revealed significant AI involvement in 16.7% of recently published orthopaedic literature. This finding highlights the importance of clear guidelines, ethical standards, responsible AI use, and improved detection tools to maintain the quality, authenticity, and integrity of orthopaedic research.

PMID:40446076 | DOI:10.2106/JBJS.24.01462

Five-Year Functional Outcomes After Acetabular Labral Repair with and without Bone Marrow Aspirate Concentrate

J Bone Joint Surg Am. 2025 May 30. doi: 10.2106/JBJS.24.00602. Online ahead of print.

ABSTRACT

BACKGROUND: Bone marrow aspirate concentrate (BMAC) augmentation at the time of hip arthroscopy is a potential solution to improve functional outcomes in patients with cartilage damage concomitant with acetabular labral tearing; however, reported functional outcomes to date have not extended beyond 24 months of follow-up. Therefore, the present study compared minimum 5-year outcomes in patients treated with or without BMAC augmentation to address chondral damage during arthroscopic labral repair.

METHODS: This was a prospective cohort study analyzing patients who underwent acetabular labral repair performed by a single surgeon. Patients were stratified into either the BMAC cohort or the control cohort depending on whether BMAC was utilized in conjunction with arthroscopic labral repair. Demographic and intraoperative variables, including chondrolabral junction breakdown and articular cartilage damage, were compared between cohorts, as were patient-reported outcome measures (PROMs) at enrollment and at 3, 6, 12, 24, and 60 months postoperatively.

RESULTS: Eighty-one hips were included for analysis: 39 (38 patients) in the BMAC cohort and 42 (39 patients) in the control cohort. Univariate analyses demonstrated similar baseline characteristics between groups, including body mass index, Tönnis angle, lateral center-edge angle (LCEA), and alpha angle (p > 0.05 for each). Patients treated with BMAC and patients in the control group reported similar PROMs between enrollment and the 12-month follow-up. By the 24-month follow-up, patients treated with BMAC reported significantly higher scores for the modified Harris hip score (mHHS) (p = 0.004), the International Hip Outcome Tool-33 (iHOT-33) (p = 0.012), and the Hip Outcome Score-Activities of Daily Living (HOS-ADL) (p = 0.008). This trend persisted over time, with the BMAC cohort demonstrating significantly higher scores for the mHHS (p < 0.001), iHOT-33 (p = 0.006), and the Hip Outcome Score-Sports Subscale (HOS-SS) (p = 0.012) at 60 months.

CONCLUSIONS: Patients undergoing acetabular labral repair with BMAC augmentation reported significantly greater functional improvements compared with patients undergoing repair without BMAC. These differences generally did not become significant until 24 months after surgery, at which point they increased in magnitude until the 60-month follow-up. These findings, the first intermediate-term outcomes reported following hip arthroscopy with BMAC, therefore suggest a sustained benefit at extended follow-up.

LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40446023 | DOI:10.2106/JBJS.24.00602

Completely Displaced Midshaft Clavicular Fractures with Skin Tenting in Adolescents: Results from the FACTS Multicenter Prospective Cohort Study

J Bone Joint Surg Am. 2025 May 30. doi: 10.2106/JBJS.24.00083. Online ahead of print.

ABSTRACT

BACKGROUND: Skin tenting is a commonly utilized surgical indication for clavicular fractures. The impact of skin tenting on fracture outcomes has not been investigated in adolescents. The present study compared the clinical and patient-reported outcome measures (PROMs) of nonoperatively and operatively treated adolescent clavicular fractures with skin tenting at presentation.

METHODS: Patients 10 to 18 years old with completely displaced midshaft clavicular fractures managed at 8 participating institutions from 2013 to 2022 were filtered to identify a cohort with either of 2 categories of skin tenting at initial presentation: (1) "skin tenting" or (2) "skin-at-risk for necrosis" (i.e., tented, white, and hypovascular). Demographics, fracture characteristics, treatment, complications, time to return to sport, and PROMs (i.e., American Shoulder and Elbow Surgeons score; Quick Disabilities of the Arm, Shoulder and Hand; Marx Shoulder Activity score; and European Quality of Life visual analog scale [EQ-VAS]) were analyzed at a minimum of 1-year follow-up.

RESULTS: A total of 88 (12%) of 764 prospectively enrolled adolescents with completely displaced midshaft clavicular fractures presented with skin tenting. Patients with skin tenting were older and had greater comminution, shortening, and superior displacement than those without skin tenting. A total of 58 patients with skin tenting (66%) underwent open reduction and internal fixation (ORIF), and 30 (34%) underwent nonoperative treatment, none of whom developed skin-related complications. However, 3 patients in the nonoperative cohort (10%) underwent early conversion to ORIF at a mean of 27 days (range, 6 to 62 days) post-injury. Although the nonoperative cohort was an average of <1 year younger than the ORIF cohort (nonoperative cohort, 14.5 years; ORIF cohort, 15.4 years; p = 0.04), there were no differences in sex (p = 0.23), shortening (p = 0.13), superior displacement (p = 0.14), or comminution (p = 0.32) between groups. PROMs were available for 63% of patients 1 or 2 years post-injury, with no differences in the PROMs European Quality of Life 5 Dimensions 5 Level Version (EQ-5D-5L) and EQ-VAS, complications (p = 0.76), or time to return to sport (p = 0.80) between treatment groups.

CONCLUSIONS: In this large cohort of prospectively enrolled adolescent patients with clavicular fractures, 12% of patients with completely displaced clavicular fractures presented with skin tenting, approximately one-third of whom were definitively treated nonoperatively, though 10% of the initial nonoperative cohort underwent early conversion to ORIF. Adolescents with skin tenting treated nonoperatively demonstrated no differences in PROMs, complications, or time to return to sport, compared with patients who underwent ORIF.

LEVEL OF EVIDENCE: Therapeutic Level II. See Instructions for Authors for a complete description of levels of evidence.

PMID:40446020 | DOI:10.2106/JBJS.24.00083

Perioperative Opioid Counseling for Patients Undergoing Anterior Cruciate Ligament Reconstruction: A Randomized Controlled Trial

J Bone Joint Surg Am. 2025 May 29. doi: 10.2106/JBJS.24.00822. Online ahead of print.

ABSTRACT

BACKGROUND: The use of opioids to manage pain after anterior cruciate ligament (ACL) reconstruction remains problematic. This study evaluated the impact of opioid-limiting perioperative pain management education and counseling on postoperative opioid consumption.

METHODS: A parallel-arm, randomized controlled trial was conducted at a single academic institution. We included patients ≥14 years old who underwent ACL reconstruction surgery. Patients undergoing revision ACL surgery or open cartilage procedures, or who had a history of heroin use or opioid use requiring treatment, were excluded. A computer-based system randomly assigned participants in a 1:1 ratio to receive opioid-limiting perioperative pain management education and counseling with instructions to take opioids only as a last resort (treatment group) or traditional perioperative pain management with instructions to take opioids as needed for severe pain to "stay ahead of the pain" (control group). The primary outcome was the total morphine equivalents (TMEs) consumed in the 3 months after surgery. Secondary outcomes included pain measured with the Numeric Rating Scale, sleep quality, opioid prescription refills, and patient satisfaction.

RESULTS: The trial enrolled 121 patients, with a mean age (and standard deviation [SD]) of 29 (12) years (67 [55%] male; 35 African American, 10 Asian, 69 White, and 7 other). Within 3 months after surgery, 60 patients assigned to the treatment group consumed a mean of 46.0 mg of TMEs (SD, 126.1) and 61 patients assigned to the control group consumed 63.6 mg of TMEs (SD, 83.4; p < 0.001). The average score on the Numeric Rating Scale for pain in the first 14 days was 2.5 (95% confidence interval [CI], 2.0 to 2.9) in the treatment group and 2.4 (95% CI, 1.9 to 2.9) in the control group (p = 0.82). Four patients (6.7%) in the treatment group and 6 patients (9.8%) in the control group refilled their oxycodone prescriptions within 3 months after surgery (p = 0.53). Sleep quality and patient satisfaction were similar between groups.

CONCLUSIONS: Among patients undergoing ACL reconstruction surgery, opioid-limiting pain management education and counseling reduced opioid consumption with no observed increase in postoperative pain. Clinicians should consider this easily implementable approach to reduce opioid use among patients undergoing this common procedure.

LEVEL OF EVIDENCE: Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence.

PMID:40440513 | DOI:10.2106/JBJS.24.00822

Functional and Radiographic Outcomes of Bone Grafting for Severe Glenoid Defects in Reverse Shoulder Arthroplasty: A Minimum 5-Year Follow-up

J Bone Joint Surg Am. 2025 May 28. doi: 10.2106/JBJS.24.01052. Online ahead of print.

ABSTRACT

BACKGROUND: The outcomes of bone grafting for severe glenoid defects in reverse shoulder arthroplasty (RSA) are unpredictable. The purpose of this study was to describe the intermediate-term outcomes of glenoid bone grafting in RSA for severe glenoid defects utilizing a baseplate with a long central post.

METHODS: All patients who underwent glenoid bone grafting for severe glenoid defects during RSA from 2008 to 2018, with a minimum of 5 years of follow-up, were included. Preoperative, immediate postoperative, and minimum 5-year postoperative American Shoulder and Elbow Surgeons (ASES) scores, visual analog scale (VAS) pain scores, and radiographs were obtained and reviewed. Baseplate failure was defined as gross radiographic baseplate cutout or baseplate revision due to implant loosening.

RESULTS: Of the 56 shoulders that underwent bone grafting, 14 were not available because the patients had died and 1 was excluded because of infection, leaving 41 shoulders available for follow-up. There were 4 shoulders in which the patients were lost to follow-up; therefore, the final follow-up rate was 90% (37 of 41) at a mean of 6.8 ± 2.4 years. There were 17 revision procedures and 20 primary procedures performed. Humeral head autograft was utilized in 16 shoulders, and femoral head allograft was utilized in 21 shoulders. Overall, 8 baseplates failed (allograft, 7 [33.3% failure] of 21; autograft, 1 [6.3% failure] of 16; p = 0.104). Revision surgery (7 [41.2%] of 17) was associated with a higher rate of baseplate failure (p = 0.014) than primary procedures (1 [5%] of 20). The mean time to baseplate failure was 2.1 ± 1.5 years, with 2 failures occurring more than 4 years postoperatively. Male sex and a lower Charlson Comorbidity Index were associated with baseplate failure (all p < 0.05). The 5-year overall baseplate survivorship was 78.4%.

CONCLUSIONS: Glenoid bone grafting with RSA for severe glenoid defects had an overall baseplate survivorship rate of 78.4% at the intermediate-term follow-up. Primary RSA with autografting for severe defects yielded survivorship of 95%, whereas revision RSA with allograft reconstruction had poorer survivorship (58.8%). Although primary RSA with autograft reconstruction resulted in a high success rate, revision RSA with allograft reconstruction using a central-post baseplate had an elevated baseplate failure rate, and alternative surgical solutions for revision RSA should be considered.

LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40435212 | DOI:10.2106/JBJS.24.01052

Enabling Technology in Fracture Surgery: State of the Art

J Bone Joint Surg Am. 2025 May 27. doi: 10.2106/JBJS.24.00938. Online ahead of print.

ABSTRACT

➢ Three-dimensional (3D) printing and virtual modeling, using computed tomographic (CT) scans as a base for the 3D-printed model, help surgeons to visualize relevant anatomy, may provide a better understanding of fracture planes, may help to plan surgical approaches, and can possibly simulate surgical fixation options.
➢ Navigation systems create real-time 3D maps of patient anatomy intraoperatively, with most literature in orthopaedic trauma thus far demonstrating efficacy in percutaneous screw placement using preoperative imaging data or intraoperative markers.
➢ Augmented reality and virtual reality are new applications in orthopaedic trauma, with the former in particular demonstrating the potential utility in intraoperative visualization of implant placement.
➢ Use of 3D-printed metal implants has been studied in limited sample sizes thus far. However, early results have suggested that they may have good efficacy in improving intraoperative measures and postoperative outcomes.

PMID:40424369 | DOI:10.2106/JBJS.24.00938

Antibiotic Holiday in 2-Stage Exchange for Periprosthetic Joint Infection: A Scoping Review

J Bone Joint Surg Am. 2025 May 26. doi: 10.2106/JBJS.24.01275. Online ahead of print.

ABSTRACT

BACKGROUND: The use of a 2-stage exchange remains a common management strategy for periprosthetic joint infection (PJI). An "antibiotic holiday" before the second stage, intended to confirm clearance of the infection, is often employed, but there is little evidence to guide this practice. The aim of this review was to systematically map the literature reporting on the use of an antibiotic holiday as part of a 2-stage revision for chronic PJI and to answer the question: is there a role for an antibiotic holiday in patients undergoing 2-stage exchange arthroplasty for PJI?

METHODS: Given the heterogeneity of the literature on this topic, a Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA)-compliant scoping review was conducted. Two reviewers developed and refined the search strategy and study eligibility criteria and pilot-tested the data charting form prior to data extraction. Data were analyzed descriptively.

RESULTS: Three databases were screened, with 504 full-text articles retrieved for review after screening 2,579 titles and abstracts. Of these, 243 were included for data charting. Most studies (238 of 243; 97.9%) were case series, and the remaining 5 (2.1%) were cohort studies that incorporated a direct comparison between continuous therapy and an antibiotic holiday. Most case series (202 of 238; 84.9%) utilized an antibiotic holiday. The proportion of patients who experienced treatment failure in the continuous therapy group (271 of 2,074 patients; 13.1%) was lower than that in the antibiotic holiday group (2,843 of 17,329 patients; 16.4%; p < 0.001). There was a greater proportion of studies with a between-stage interval of <3 months among case series utilizing continuous antibiotic therapy (66.7%) compared with those utilizing an antibiotic holiday (27.2%; p < 0.001).
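
For orientation, the comparison of pooled failure proportions can be checked against the reported counts; the sketch below uses a standard chi-square test of two proportions, which is an assumption on our part, since the review describes its own analysis as descriptive.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Pooled counts reported in the abstract: [failures, non-failures]
continuous = [271, 2074 - 271]      # continuous antibiotic therapy
holiday = [2843, 17329 - 2843]      # antibiotic holiday

chi2, p, dof, expected = chi2_contingency(np.array([continuous, holiday]))
print(f"Failure rates: {271/2074:.1%} vs. {2843/17329:.1%}; p = {p:.1g}")  # ~13.1% vs. ~16.4%, p < 0.001
```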

CONCLUSIONS: There is no proven superiority of an antibiotic holiday during a 2-stage exchange to treat chronic PJI. Because the interval between the first and second stages must be extended to accommodate an antibiotic holiday, patients may be subjected to an unnecessarily prolonged course of treatment without an improvement in outcome.

LEVEL OF EVIDENCE: Therapeutic Level IV. See Instructions for Authors for a complete description of levels of evidence.

PMID:40418706 | DOI:10.2106/JBJS.24.01275

The Effect of Implant Constraint and Ligament Repair on Compartment Balancing After Medial Collateral Ligament Injury in TKA

J Bone Joint Surg Am. 2025 May 23. doi: 10.2106/JBJS.24.01327. Online ahead of print.

ABSTRACT

BACKGROUND: An intraoperative midsubstance injury to the medial collateral ligament (MCL) is a devastating complication of total knee arthroplasty (TKA). No single treatment method has been shown to yield optimal stability. This cadaveric study compared the effects of primary MCL repair, increased prosthetic constraint, and a combination of both techniques on tibiofemoral compartment gapping after an iatrogenic MCL injury.

METHODS: We performed 16 cadaveric, robotic-assisted TKAs (CORI; Smith+Nephew) and recorded tibiofemoral gap measurements at 10°, 30°, 60°, and 90° of flexion with a posterior-stabilized (PS) prosthesis as the control group. The experimental groups had no MCL repair and a PS component, no MCL repair and a varus-valgus constrained (VVC) component, MCL repair with a PS component, and MCL repair with a VVC component. The MCL was repaired with 2 figure-8 nonabsorbable sutures. Gap measurements were obtained with manual tensioning performed by the same surgeon for all specimens. The mean medial tibiofemoral gap with the 3 different methods of interest (the no MCL repair with VVC component group, the MCL repair with PS component group, and the MCL repair with VVC component group) was compared with the control group for the rate of deficit (RD) and was compared with the no MCL repair and PS component group for the rate of improvement (RI). Simple statistics were used to calculate the mean medial balance for the groups, and analysis of variance (ANOVA) modeling was used to determine the mean changes in RD and RI, with significance set at p < 0.05.

RESULTS: The mean RD was highest for the no MCL repair with PS component group at 621.13%, demonstrating an approximately 6-fold increase in medial tibiofemoral gapping compared with the control group. This was followed by the no MCL repair with VVC component group at 93.02%, the MCL repair with PS component group at 65.66%, and the MCL repair with VVC component group at 20.01% (p < 0.001). The mean RI for the MCL repair with VVC component group was highest at 83.08%, meaning that the combination of VVC component and MCL repair resulted in an 83% improvement in medial tibiofemoral gapping from no MCL repair with PS component. This was followed by the MCL repair with PS component group at 76.62% and the no MCL repair with VVC component group at 72.95% (p < 0.001).
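
The abstract does not spell out the RD and RI formulas; a reading consistent with the reported percentages is sketched below, with gap values expressed as placeholders relative to the control gap rather than measured data.

```python
def rate_of_deficit(gap_method: float, gap_control: float) -> float:
    """Percentage increase in medial gap relative to the intact-MCL control (assumed definition)."""
    return (gap_method - gap_control) / gap_control * 100

def rate_of_improvement(gap_method: float, gap_worst: float) -> float:
    """Percentage reduction in medial gap relative to the unrepaired-MCL PS group (assumed definition)."""
    return (gap_worst - gap_method) / gap_worst * 100

# Placeholder gaps scaled so that the control gap is 1.0
gap_control = 1.0
gap_no_repair_ps = 7.21    # reproduces the reported ~621% RD
gap_repair_vvc = 1.20      # reproduces the reported ~20% RD

print(rate_of_deficit(gap_no_repair_ps, gap_control))         # ~621
print(rate_of_improvement(gap_repair_vvc, gap_no_repair_ps))  # ~83, matching the reported RI
```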

CONCLUSIONS: This cadaveric study demonstrated that primary MCL repair with a VVC component was the most effective at minimizing the deficit after an MCL injury and provided the highest RI. MCL repair with a PS component and no MCL repair with a VVC component were less effective reconstructive choices. This study supports the combination of a simple MCL repair with a VVC component as the most stable reconstructive option following an intraoperative MCL injury.

PMID:40408512 | DOI:10.2106/JBJS.24.01327

The Effect of Traction and Spinal Cord Morphology on Intraoperative Neuromonitoring Alerts in Adolescent Idiopathic Scoliosis

J Bone Joint Surg Am. 2025 May 23. doi: 10.2106/JBJS.24.01353. Online ahead of print.

ABSTRACT

BACKGROUND: Patients with apical spinal cord deformity have been shown to be at a greater risk for intraoperative neuromonitoring (IONM) alerts when undergoing posterior spinal instrumented fusion (PSF) for adolescent idiopathic scoliosis (AIS). The use of intraoperative traction during deformity correction has also been associated with an increased risk of IONM alerts. With use of the Spinal Cord Shape Classification System (SCSCS), we investigated the interaction between spinal cord type and the use of intraoperative traction and their impact on IONM alerts during the surgical correction of AIS.

METHODS: A total of 441 consecutive patients who underwent PSF or combined PSF plus anterior spinal fusion (ASF) for AIS between 2003 and 2022 were retrospectively reviewed. Those with major thoracic curves of ≥70° and available preoperative magnetic resonance images (MRIs) were included. Charts were reviewed for IONM alerts and the use of intraoperative traction. Spinal cord morphology was determined using the SCSCS. A multivariable regression model was used to assess the risk factors for an IONM alert.

RESULTS: Preoperative MRIs were available for 102 patients. Type-3 cords were present in 15 (14.7%) of the 102 patients. Intraoperative traction was used in 15 (14.7%) of the 102 patients, including 5 with type-3 cords. Patients with type-3 cords were more likely to have an IONM alert than those with type-1 or 2 cords (40.0% [type 3] versus 12.6% [type 1 or 2]; odds ratio [OR], 4.60; 95% confidence interval [CI], 1.34 to 15.53). No such difference was observed between patients with type-1 cords and those with type-2 cords (12.5% and 12.7%, respectively; p > 0.9999). All patients with type-3 cords placed in intraoperative traction experienced IONM alerts, whereas only 10% of patients with type-3 cords not placed in traction experienced such alerts (p = 0.002). Multivariable regression modeling revealed intraoperative traction to be the only independent risk factor for an IONM alert (OR, 9.37; 95% CI, 2.47 to 38.24).
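
The reported odds ratio for type-3 cords can be reconstructed from the percentages above; the 2×2 counts in the sketch below (6 of 15 versus 11 of 87 alerts) are inferred from those percentages, not taken directly from the paper.

```python
# Counts inferred from the reported alert rates: 40% of 15 type-3 cords,
# 12.6% of the 87 type-1/2 cords. Treat them as an illustration.
alert_t3, no_alert_t3 = 6, 9       # type-3 cords
alert_t12, no_alert_t12 = 11, 76   # type-1 or type-2 cords

odds_ratio = (alert_t3 * no_alert_t12) / (no_alert_t3 * alert_t12)
print(f"OR ~= {odds_ratio:.2f}")  # ~4.61, close to the reported 4.60
```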

CONCLUSIONS: This study demonstrated that 14.7% of patients with AIS and curves of ≥70° had a type-3 cord. Intraoperative traction carried a ninefold increased risk of an IONM alert. When intraoperative traction is used for type-3 cords, surgeons should expect IONM alerts to occur. The SCSCS can be condensed into 2 groups for a pediatric population.

LEVEL OF EVIDENCE: Prognostic Level III. See Instructions for Authors for a complete description of levels of evidence.

PMID:40408508 | DOI:10.2106/JBJS.24.01353

Immediate Weight-Bearing Compared with Non-Weight-Bearing After Operative Ankle Fracture Fixation: Results of the INWN Pragmatic, Randomized, Multicenter Trial

J Bone Joint Surg Am. 2025 May 23. doi: 10.2106/JBJS.24.00965. Online ahead of print.

ABSTRACT

BACKGROUND: There has been weak consensus and a paucity of robust literature with regard to the best postoperative weight-bearing and immobilization regimen for operatively treated ankle fractures. This trial compared immediate protected weight-bearing (IWB) with non-weight-bearing (NWB) and cast immobilization following ankle fracture fixation (open reduction and internal fixation [ORIF]), with a particular focus on functional outcomes, complication rates, and cost utility.

METHODS: This INWN (Is postoperative Non-Weight-bearing Necessary?) study was a prospective, pragmatic, randomized controlled trial (RCT), with participants allocated in a 1:1 ratio to 1 of 2 parallel groups. IWB from postoperative day 1 in a walking boot was compared with NWB and immobilization in a cast for 6 weeks, following ORIF of all standard types of unstable ankle fractures. Skeletally immature patients and patients with tibial plafond fractures were excluded. The type of surgical fixation was at the surgeon's discretion. Patients were randomized postoperatively by an operating room nurse using computerized block randomization (20 patients per block). Surgeons were blinded until after the operation. The study was multicenter and included 2 major orthopaedic centers in Ireland. Analysis was performed on an intention-to-treat basis. The primary outcome was the functional outcome assessed by the Olerud-Molander Ankle Score (OMAS) at 6 weeks. A cost-utility analysis via decision tree modeling was performed to derive an incremental cost-effectiveness ratio (ICER).

RESULTS: We recruited 160 patients (80 per arm) between January 1, 2019, and June 30, 2020; the patients were 15 to 94 years of age (mean age, 45.5 years), and 54% were female. The IWB group demonstrated a higher mean OMAS at 6 weeks (43 ± 24 for the IWB group and 35 ± 20 for the NWB group, with a mean difference of 10.4; p = 0.005). The complication rates were similar in both groups, including surgical site infection, wound dehiscence, implant removal, and further operations. Over a 1-year horizon, IWB was associated with a lower expected cost (€1,027.68) than NWB (€1,825.70) as well as a higher health benefit (0.741 quality-adjusted life-year [QALY]) than NWB (0.704 QALY). IWB dominated NWB, yielding cost savings of €798.02 and a QALY gain of 0.04.
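
The dominance result follows directly from the reported expected costs and QALYs; a minimal sketch of that arithmetic is below (when one strategy is both cheaper and more effective, it dominates and no ICER needs to be quoted).

```python
# Expected values over a 1-year horizon, as reported in the abstract
cost_iwb, qaly_iwb = 1027.68, 0.741   # immediate protected weight-bearing
cost_nwb, qaly_nwb = 1825.70, 0.704   # non-weight-bearing in a cast

incremental_cost = cost_iwb - cost_nwb   # -798.02 EUR, i.e., a saving
incremental_qaly = qaly_iwb - qaly_nwb   # +0.037 QALY (~0.04 as reported)

if incremental_cost < 0 and incremental_qaly > 0:
    print("IWB dominates NWB: cheaper and more effective, so no ICER is required")
else:
    print(f"ICER = {incremental_cost / incremental_qaly:.2f} EUR per QALY gained")
```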

CONCLUSIONS: IWB in a walking boot following ankle fracture fixation demonstrated superior functional outcomes, greater cost savings, earlier return to work, and similar complication rates compared with NWB in a cast for 6 weeks. These findings support the implementation of IWB as the routine mobilization protocol following ankle fracture fixation.

LEVEL OF EVIDENCE: Therapeutic Level I. See Instructions for Authors for a complete description of levels of evidence.

PMID:40408465 | DOI:10.2106/JBJS.24.00965
