Tuesday, 22 November 2022

Great Toe Transplantation



Semin Plast Surg
DOI: 10.1055/s-0042-1758689

Although relatively uncommon in the general population, thumb amputations cause severe disability; more than 3,300 thumb amputations have been reported in the United States. The thumb accounts for around 40% of hand function, so its loss carries significant medical, hospital, and societal costs. The primary goal of thumb reconstruction surgery is to restore function, including range of motion, fine and tripod pinch, power grasp, strength, and sensibility, while the secondary goal is to restore hand aesthetics. When the thumb can be replanted, like-for-like replacement is possible; when replantation is not possible, great toe-to-hand transplantation is the best available reconstruction. Compared with other reconstructive options such as osteoplastic thumb reconstruction, pollicization, second toe transplantation, and thumb prostheses, great toe transplantation provides superior function and aesthetics. For restoring pinch, sensibility, strength, and aesthetics of the hand with well-tolerated donor-site morbidity, toe-to-thumb transplantation is regarded as the gold standard.
[...]


Article in Thieme eJournals:
Table of contents | Abstract | Full text: https://www.thieme-connect.com/products/ejournals/issue/eFirst/10.1055/s-00000051


Asparaginase‐related diabetic ketoacidosis: Analysis of the FDA Adverse Event Reporting System (FAERS) data and literature review


This study investigated the association between asparaginase and diabetic ketoacidosis (DKA) from the perspective of adverse reaction signal detection and a literature review. The reporting odds ratio (ROR) for DKA with l-asparaginase was statistically significant, but there was no statistically significant association for DKA with pegaspargase. Combined with the results of the literature review, we speculate that the asparaginase dosage form may affect the occurrence of DKA.


Abstract

What is Known and Objective

Diabetic ketoacidosis (DKA) may occur during asparaginase use. However, limited by the available study populations, the association between asparaginase and DKA has not been elucidated. The purpose of this study was to determine the potential association between asparaginase and DKA and to analyse related clinical characteristics and possible risk factors.

Methods

Disproportionality analysis with the reporting odds ratio (ROR) was used to detect adverse reaction signals of asparaginase-associated DKA in the FDA Adverse Event Reporting System (FAERS). A literature review was conducted to further analyse clinical characteristics, possible risk factors, and other noteworthy findings in asparaginase-associated DKA.
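As a rough illustration of the disproportionality method, the sketch below computes an ROR and its 95% confidence interval from a hypothetical 2×2 table of FAERS report counts; the counts, function name, and signal thresholds are illustrative assumptions, not values from this study.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR and 95% CI from a 2x2 table of spontaneous report counts.

    a: reports of the event (e.g., DKA) with the drug of interest
    b: reports of other events with the drug of interest
    c: reports of the event with all other drugs
    d: reports of other events with all other drugs
    """
    ror = (a / b) / (c / d)
    # Standard error of ln(ROR) (Woolf method)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(ror) - 1.96 * se_log)
    upper = math.exp(math.log(ror) + 1.96 * se_log)
    return ror, lower, upper

# Hypothetical counts for illustration only; a signal is typically flagged when the
# number of reports and the lower CI bound exceed prespecified thresholds
# (e.g., >= 3 reports and a lower bound > 1).
print(reporting_odds_ratio(a=12, b=4000, c=90000, d=72_000_000))
```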

Results and Discussion

A total of 12 reports of DKA associated with l-asparaginase (l-asp) and 6 reports associated with pegaspargase (PEG-asp) were extracted from FAERS; more than 50% of the cases were classified as serious adverse events. DKA was a positive signal for l-asp (ROR = 2.397, 95% CI 1.360–4.226) but was not closely related to the use of PEG-asp (ROR = 1.602, 95% CI 0.719–3.570). A search of PubMed, Embase, and Web of Science identified a total of eight published cases. The patients were mainly adolescents, aged 11 to 25 years with a median age of 16 years. The dosage-form distribution was unbalanced: seven patients received l-asp and only one received PEG-asp.

What is New and Conclusions

The ROR of DKA caused by l-asp was statistically significant, but there was no statistically significant association for DKA caused by PEG-asp. The asparaginase dosage form may affect the occurrence of DKA, but further research is needed.


Response assessment by positron emission tomography‐computed tomography as compared with contrast‐enhanced computed tomography in childhood Hodgkin lymphoma can reduce the need for radiotherapy in low‐ and middle‐income countries


Abstract

Introduction

The InPOG-HL-15-01, a multicentric prospective study, used a risk-stratified and response-based approach with doxorubicin, bleomycin, vinblastine, and dacarbazine (ABVD) backbone to treat children and adolescents with newly diagnosed Hodgkin lymphoma (HL) and reduce the use of radiation therapy (RT). Children/adolescents with bulky disease or inadequate response at early response assessment (ERA) after two cycles of chemotherapy were assigned to receive RT. For ERA, positron emission tomography computed tomography (PET-CT) was recommended but not mandatory in view of limited access. This study aimed to compare the impact of using contrast-enhanced computed tomography (CECT) and PET-CT on treatment decisions and outcomes.

Methodology

A total of 396 patients were enrolled, and 382 had an ERA at the assigned time point. Satisfactory response was defined as a Deauville score of 3 or less for patients undergoing PET-CT and complete response (CR)/very good partial response (VGPR) for patients undergoing CECT. Outcomes of interest included 5-year event-free survival (EFS), EFS including abandonment (EFSa), and overall survival (OS).
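For readers less familiar with these outcome measures, the following is a minimal, hypothetical sketch of how 5-year survival estimates and a between-group comparison are typically obtained with the lifelines library; the synthetic durations, event indicators, and group labels are placeholders and do not reproduce the trial's analysis.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Placeholder follow-up times (years) and event indicators (1 = event, 0 = censored)
# for two illustrative ERA-modality groups.
rng = np.random.default_rng(0)
pet_durations, pet_events = rng.uniform(0.5, 7.0, 100), rng.integers(0, 2, 100)
cect_durations, cect_events = rng.uniform(0.5, 7.0, 100), rng.integers(0, 2, 100)

kmf = KaplanMeierFitter()
kmf.fit(pet_durations, event_observed=pet_events, label="PET-CT ERA")
print("5-year survival estimate (PET-CT):", float(kmf.predict(5.0)))

kmf.fit(cect_durations, event_observed=cect_events, label="CECT ERA")
print("5-year survival estimate (CECT):", float(kmf.predict(5.0)))

# Log-rank test compares the full survival curves of the two groups.
result = logrank_test(pet_durations, cect_durations,
                      event_observed_A=pet_events, event_observed_B=cect_events)
print("log-rank p-value:", result.p_value)
```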

Results

At ERA, a satisfactory response was documented in 277 of 382 (72.5%) participants, and the rate was significantly higher with PET-CT (151 of 186, 81.2%) than with CECT-based assessment (126 of 196, 64.3%) (p value < .001). Among the 203 patients with nonbulky disease (in whom the indication for RT depended entirely on the ERA), 96 of 114 (84.2%) and 61 of 89 (68.5%) patients achieved a satisfactory response on PET-CT and CECT respectively (p value = .008), so a smaller proportion of patients in the PET-CT group received RT. Despite the lower use of RT, the 5-year OS was comparable between the CECT-based (91.8%) and PET-CT-based (94.1%) ERA groups (p value = .391), as was the 5-year EFS (86.7% vs. 85.5%, p value = .724).

Conclusion

Use of PET-CT as the modality for ERA is more likely to indicate a satisfactory response than CECT and thereby decreases the need for RT in response-based treatment algorithms for children with HL. The reduced use of RT did not affect the overall outcome and would plausibly lower the risk of delayed toxic effects.


Relapse after non‐metastatic rhabdomyosarcoma: The impact of routine surveillance imaging on early detection and post‐relapse survival


Abstract

Background

Patients with rhabdomyosarcoma (RMS) whose disease relapses have little chance of being cured, so front-line treatments are usually followed up with surveillance imaging in an effort to detect any recurrences as early as possible, and thereby improve post-relapse outcomes. The real benefit of such routine surveillance imaging in RMS remains to be demonstrated, however. This retrospective, single-center study examines how well surveillance imaging identifies recurrent tumors and its impact on post-relapse survival.

Methods

The analysis concerned 79 patients <21 years old treated between 1985 and 2020 whose initially localized RMS relapsed. Clinical findings, treatment modalities, and survival were analyzed, comparing patients whose relapse was first suspected from symptoms they developed (clinical symptoms group) with those whose relapse was identified by radiological surveillance (routine imaging group).

Results

Tumor relapses came to light because of clinical symptoms in 42 cases, and on routine imaging in 37. The time to relapse was much the same in the two groups. The median overall survival (OS) and 5-year OS rate were, respectively, 10 months and 12.6% in the clinical symptoms group, and 11 months and 27.5% in the routine imaging group (p-value .327). Among patients with favorable prognostic scores, survival was better for those in the routine imaging group (5-year OS 75.0% vs. 33.0%, p-value .047).

Conclusion

It remains doubtful whether surveillance imaging has any real impact on RMS relapse detection and patients' post-relapse survival. Further studies are needed to establish the most appropriate follow-up recommendations, taking the potentially negative effects of regular radiological exams into account.


Marginal bone loss around dental implants: comparison between matched groups of bruxer and non‐bruxer patients: A retrospective case–control study


Abstract

Purpose

To compare marginal bone loss (MBL) around dental implants in a group of bruxers in relation to a matched group of non-bruxers.

Methods

The present record-based retrospective study included patients selected from individuals treated with dental implants at one specialist clinic in Malmö. Only implants that had not been lost, with baseline radiographs taken within 12 months after implant placement and a minimum of 36 months of radiological follow-up, were considered for inclusion. Univariate linear regression models and a linear mixed-effects model were performed.
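To make the modelling approach concrete, here is a minimal sketch of how a linear mixed-effects model for MBL over time might be fitted in Python with statsmodels; the synthetic data, column names, and formula are assumptions for illustration and do not reproduce the authors' analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic long-format data for illustration only: one row per implant per
# radiographic follow-up; column names are assumed, not the study's variables.
rng = np.random.default_rng(1)
n = 400
data = pd.DataFrame({
    "patient_id": rng.integers(0, 100, n),
    "years": rng.uniform(3, 10, n),
    "bruxism": rng.integers(0, 2, n),
    "smoking": rng.integers(0, 2, n),
    "age": rng.uniform(40, 80, n),
})
data["mbl_mm"] = (0.05 * data["years"] + 0.3 * data["bruxism"] + 0.2 * data["smoking"]
                  + 0.01 * data["age"] + rng.normal(0, 0.2, n))

# A random intercept per patient accounts for multiple implants and time points
# clustered within the same individual.
model = smf.mixedlm("mbl_mm ~ years + bruxism + smoking + age + bruxism:smoking",
                    data=data, groups=data["patient_id"])
print(model.fit().summary())
```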

Results

Two hundred and four patients (104 bruxers, 100 non-bruxers), with a total of 811 implants (416 in bruxers, 395 in non-bruxers), were included in the study. The linear mixed-effects model suggested that bruxism, smoking, age, region of the jaws, implant diameter, and prosthesis type had a statistically significant influence on MBL over time. Individuals who were both bruxers and smokers showed greater MBL than individuals who were either bruxers or smokers, or neither (p < 0.001).

Conclusions

Bruxism, higher age, smoking, and the combination of bruxism and smoking are all suggested to increase the risk of MBL over time. Other factors that correlated with increased MBL were implant diameter, region of the jaws, and prosthesis type, but robust conclusions cannot be drawn for these factors, as the categories of these variables were very unbalanced.


The fate of interneurons, GABAA receptor sub‐types and perineuronal nets in Alzheimer's disease


Abstract

Alzheimer's disease (AD) is the most common neurological disease. It is associated with gradual memory loss and correlates with synaptic hyperactivity and abnormal oscillatory rhythmic brain activity, which precede phenotypic alterations and are partly responsible for the spread of the disease pathology. Synaptic hyperactivity is thought to arise from altered homeostasis of phasic and tonic synaptic inhibition, which is orchestrated by the GABAA inhibitory system, encompassing subclasses of interneurons and GABAA receptors that play a vital role in cognitive functions, including learning and memory. Furthermore, the perineuronal nets (PNNs), extracellular matrix structures that often go unnoticed in considerations of AD pathology, encapsulate the inhibitory cells and neurites in critical brain regions. They have recently attracted attention for their crucial role in synaptic stabilisation and excitatory-inhibitory balance and, when disrupted, serve as a potential trigger for AD-associated synaptic imbalance. In this review, we therefore summarise the current understanding of the selective vulnerability of distinct interneuron subtypes, their synaptic and extrasynaptic GABAAR subtypes, and the changes in PNNs in AD, detailing their contribution to the mechanisms of disease development. We aim to highlight how seemingly unique malfunctions in each component of the interneuronal GABA inhibitory system can be tied together to produce critical circuit dysfunction, leading to the irreversible symptomatic damage observed in AD.


High‐frequency (10 kHz) Spinal Cord Stimulation (SCS) as a Salvage Therapy for Failed Traditional SCS: A Narrative Review of the Available Evidence


Abstract

Introduction

Traditional spinal cord stimulation (t-SCS) has been used to treat chronic pain for over 50 years. However, up to 30% of patients undergo explant, with the main indication being loss of efficacy (LoE), and few alternative treatment options exist for these patients. Strategies to mitigate LoE commonly include conversion to another type of SCS (termed 'salvage' or 'rescue'). This review summarizes the existing literature concerning the efficacy and safety of 10 kHz SCS as a salvage therapy.

Methods

We searched PubMed, the Cochrane Library, ClinicalTrials.gov, and other sources, covering January 2009 to April 2021. Records were retained if the authors reported clinical outcomes with a minimum of 3 months of follow-up in patients implanted with a Senza® 10 kHz SCS system in an effort to treat t-SCS LoE.

Results

Ten articles were eligible for inclusion, reporting 3 prospective studies and 7 retrospective reviews. In the single study that salvaged patients without a repeat trial prior to surgery, 81% of patients were responders (≥50% pain relief from baseline), with mean pain relief of 60%. Among repeat-trial studies, the responder rate ranged from 46% to 80%, and mean pain relief from 47% to 68%. No unanticipated therapy-related safety issues were reported among the included articles.

Conclusion

Preliminary data suggest that chronic back and/or leg pain patients with t-SCS LoE can experience improved and durable pain relief after conversion to 10 kHz SCS. However, additional research is needed to define predictors of success and establish whether salvage without a repeat trial is a viable conversion method.


Childcare attendance and risk of infections in childhood and adolescence

Abstract
Background
It has been suggested that the transiently increased infection risk following childcare enrolment is compensated by decreased infection risk later in childhood and adolescence. We investigated how childcare enrolment affected rates of antimicrobial-treated infections during childhood and adolescence.
Methods
In a register-based cohort study of all children born in Denmark 1997–2014 with available exposure information (n = 1 007 448), we assessed the association between childcare enrolment before age 6 years and infection risks up to age 20 years, using antimicrobial exposure as proxy for infections. Nationwide childcare and prescription data were used. We estimated infection rates and the cumulative number of infections using adjusted Poisson regression models.
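To illustrate the modelling step, here is a minimal, hypothetical sketch of an adjusted Poisson rate model with a person-time offset in statsmodels; the synthetic data and variable names are assumptions and do not reflect the registry analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic child-period records: infection counts, person-years at risk, and
# covariates. Names and values are illustrative only.
rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "infections": rng.poisson(1.5, n),
    "person_years": rng.uniform(0.5, 2.0, n),
    "enrol_before_12m": rng.integers(0, 2, n),
    "age_years": rng.uniform(0, 20, n),
    "sex": rng.integers(0, 2, n),
})

# Poisson regression of infection counts with log person-time as the offset,
# adjusted for age and sex; exponentiated coefficients are incidence rate ratios.
model = smf.glm("infections ~ enrol_before_12m + age_years + sex",
                data=df, family=sm.families.Poisson(),
                offset=np.log(df["person_years"]))
result = model.fit()
print(np.exp(result.params))
```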
Results
We observed 4 599 993 independent episodes of infection (antimicrobial exposure) during follow-up. Childcare enrolment transiently increased infection rates; the younger the child, the greater the increase. The resulting increase in the cumulative number of infections associated with earlier age at childcare enrolment was not compensated by a lower infection risk later in childhood or adolescence. Accordingly, children enrolled in childcare before age 12 months had experienced 0.5–0.7 more infections by age 6 years (4.5–5.1 infections in total) than peers enrolled at age 3 years, a difference that persisted throughout adolescence. The type of childcare had little impact on infection risks.
Conclusions
Early age at childcare enrolment is associated with a modest increase in the cumulative number of antimicrobial-treated infections at all ages through adolescence. Emphasis should be given to disrupting infectious disease transmission in childcare facilities through prevention strategies with particular focus on the youngest children.

FACE-Q satisfaction following upper third facial gender-affirming surgery using custom bone-section guides

Postoperative satisfaction after facial gender-affirming surgery (FGAS) has not yet been assessed using a validated questionnaire; there is currently no postoperative satisfaction questionnaire specific to transgender patients undergoing facial surgery. The contributions of three-dimensional planning to fronto-orbital surgery in trans women and the use of bone-cutting guides in facial feminization surgery have been demonstrated. The primary objective of this study was to evaluate postoperative satisfaction with the upper third of the face in trans women, using a validated questionnaire (FACE-Q), after fronto-orbital surgery performed with custom-made bone-cutting guides. (Source: International Journal of Oral and Maxillofacial Surgery)
