Objective To explore the risk factors for blood transfusion during liver transplantation and to construct a prediction model. Methods The patients who underwent liver transplantation in Beijing Youan Hospital of Capital Medical University from March 2020 to December 2020 and met the inclusion and exclusion criteria of this study were retrospectively collected. Univariate and multivariate logistic regression analyses were used to evaluate the risk factors for blood transfusion during liver transplantation and to construct a prediction model for intraoperative blood transfusion. Results A total of 151 eligible liver transplantation patients were included in this study, comprising 51 non-transfusion patients and 100 transfusion patients. Univariate analysis showed statistically significant differences in primary diagnosis, preoperative hemoglobin (Hb), platelet count, prothrombin time, international normalized ratio, Child-Turcotte-Pugh score, and model for end-stage liver disease (MELD) score between the two groups (P<0.05). The variables identified by univariate analysis were then screened by a stepwise method, and preoperative Hb and MELD score entered the multivariate logistic regression analysis, which showed that preoperative Hb ≤113 g/L and MELD score >14 increased the risk of blood transfusion during liver transplantation [Hb: OR=6.652, 95%CI (2.282, 19.392), P<0.001; MELD score: OR=16.037, 95%CI (6.336, 40.592), P<0.001]. The area under the receiver operating characteristic curve of the logistic regression model was 0.873 [95%CI (0.808, 0.919), P<0.001]; the sensitivity and specificity were 91.0% and 67.5%, respectively, the Youden index was 0.674, and the accuracy was 86.1%. Conclusions The results of this study suggest that preoperative Hb ≤113 g/L and MELD score >14 increase the risk of blood transfusion during liver transplantation. The logistic regression model constructed from preoperative Hb and MELD score has good sensitivity and specificity for predicting intraoperative blood transfusion.
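As an illustration of the modelling workflow this abstract describes, the following is a minimal sketch in Python: a logistic regression on two dichotomized predictors, followed by computation of the area under the curve, sensitivity, specificity, Youden index, and accuracy. The patient-level data are not public, so synthetic data stand in; the variable names and cutoffs (Hb ≤113 g/L, MELD >14) follow the abstract, and the coefficients below are illustrative assumptions only.

```python
# Minimal sketch of the prediction-model workflow on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)
n = 151
# Dichotomized predictors: preoperative Hb <= 113 g/L, MELD score > 14
hb_low = rng.integers(0, 2, n)
meld_high = rng.integers(0, 2, n)
# Synthetic outcome loosely tied to both predictors (assumed effect sizes)
logit = -1.2 + 1.9 * hb_low + 2.8 * meld_high
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([hb_low, meld_high])
model = LogisticRegression().fit(X, y)
prob = model.predict_proba(X)[:, 1]

auc = roc_auc_score(y, prob)
tn, fp, fn, tp = confusion_matrix(y, (prob >= 0.5).astype(int)).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
youden = sensitivity + specificity - 1        # Youden index J = Se + Sp - 1
accuracy = (tp + tn) / n
print(f"AUC={auc:.3f} Se={sensitivity:.3f} Sp={specificity:.3f} "
      f"J={youden:.3f} Acc={accuracy:.3f}")
```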
Transcatheter aortic valve replacement (TAVR) has become a common therapeutic option for aortic stenosis, and evidence on the importance of precise anatomical assessment for TAVR is accumulating. This paper presents the case of a 71-year-old female patient who had an extremely high risk of coronary obstruction because both coronary ostia lay too low. The patient successfully underwent TAVR with coronary protection. During the procedure, both coronary arteries were initially protected with guidewires only. After valve deployment, angiography suggested that chimney stenting was required for the left coronary artery. The whole procedure was uneventful, and both coronary arteries remained patent.
Objective To explore the impact of hospital staff’s risk perception on their emergency responses, and to provide a reference for future responses to public health emergencies. Methods Based on participatory observation and in-depth interviews, staff of the First Affiliated Hospital of Guangzhou Medical University who participated in the prevention and control of coronavirus disease 2019 from April to September 2020 were selected, and information on their risk perception and emergency responses was collected. Results A total of 61 hospital staff were included, holding positions in the hospital leading group, hospital office, medical department, logistics support department, and outpatient isolation area. The interview results showed that both individual and organizational factors affected the risk perception of hospital staff and thereby their emergency responses, mainly in psychological and behavioral aspects. Psychological reactions were manifested as greater confidence, sensitivity, and sense of responsibility and mission; behavioral aspects were mainly reflected in the initiation time, execution ability, and standardization level of emergency response actions. Conclusion Relevant departments should pay attention to the risk perception of hospital staff and improve their risk perception and emergency responses by addressing individual and organizational factors, so as to respond more effectively to future public health emergencies and reduce their adverse impact on the work of hospital staff.
Objective To systematically summarize the research progress in risk prediction models for postoperative anastomotic leakage in gastric cancer, and to explore the advantages and limitations of models constructed with traditional statistical methods and machine learning, thereby providing a theoretical basis for precise clinical prediction and early intervention. Methods By analyzing domestic and international literature, the construction strategies of logistic regression, least absolute shrinkage and selection operator (LASSO) regression, and machine learning models (support vector machine, random forest, deep learning) were systematically reviewed, and their predictive performance and clinical applicability were compared. Results The traditional logistic regression and LASSO regression models performed excellently in terms of interpretability and in small-sample scenarios but were limited by linear assumptions. The machine learning models significantly enhanced predictive capability for complex data through non-linear modeling and automatic feature extraction, but required larger datasets and posed greater interpretability challenges. Conclusions Different prediction models have their own advantages and limitations; in clinical practice, they should be flexibly selected or applied in a complementary manner according to the specific scenario. Current anastomotic leakage prediction models are evolving from single-factor analysis toward multi-modal dynamic integration. Future efforts should combine artificial intelligence with multi-center prospective clinical studies for validation, thereby advancing the development of precise and individualized predictive tools for anastomotic leakage after gastric cancer resection.
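To make the trade-off this review discusses concrete, the following sketch compares a LASSO-penalized (L1) logistic regression with a random forest by cross-validated AUC. It uses a synthetic dataset generated by scikit-learn, not any anastomotic leakage data, and the hyperparameters (C, number of trees) are arbitrary illustrative choices.

```python
# Illustrative comparison of the two model families discussed above:
# an L1-penalized (LASSO) logistic regression versus a random forest.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, random_state=0)

lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
forest = RandomForestClassifier(n_estimators=200, random_state=0)

for name, clf in [("LASSO logistic", lasso), ("random forest", forest)]:
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: cross-validated AUC = {auc:.3f}")

# The LASSO model additionally yields sparse, interpretable coefficients,
# which is the interpretability advantage noted in the review.
coef = lasso.fit(X, y).coef_.ravel()
print("non-zero coefficients:", (coef != 0).sum(), "of", coef.size)
```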
Objective To explore the association between cough patterns and cerebrovascular disease risk, and to provide epidemiological evidence for the early diagnosis and prevention of cerebrovascular disease. Methods From 2010 to 2012 in Guizhou Province, a multi-stage proportional stratified cluster sampling method was used to recruit people who met the inclusion criteria of the study into a cohort, and a baseline questionnaire covering demographic information, lifestyle, and disease history was administered. The incidence of cerebrovascular disease was followed up from 2016 to 2020. Results A total of 4804 subjects were followed up, and 4589 subjects (53.5% female) were included in the final analysis. Compared with the non-cough group, the non-chronic cough group showed no statistically significant difference in the risk of cerebrovascular disease (P>0.05); however, subjects with chronic cough were twice as likely to develop cerebrovascular disease as those without cough [risk ratio=2.00, 95% confidence interval (1.08, 3.69)]. Conclusions People with chronic cough are more likely to develop cerebrovascular disease than people without cough. More attention should be paid to the management and control of cough to prevent it from becoming chronic, so as to reduce the risk of cerebrovascular disease.
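For readers unfamiliar with the risk ratio reported above, this is a small sketch of how a risk ratio and its 95% confidence interval are computed from a 2×2 table using the standard log-scale standard error. The counts below are hypothetical placeholders, not the study's data (which report RR=2.00, 95% CI 1.08 to 3.69).

```python
# Risk ratio with a Wald-type 95% CI on the log scale.
import math

def risk_ratio(a, b, c, d):
    """a/b: events/non-events among exposed; c/d: among unexposed."""
    rr = (a / (a + b)) / (c / (c + d))
    se_log = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, (lo, hi)

# Hypothetical counts: 12/200 exposed vs 60/2000 unexposed -> RR = 2.0
rr, ci = risk_ratio(a=12, b=188, c=60, d=1940)
print(f"RR = {rr:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```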
Objective To analyze the correlation between HLA-A and HLA-B genotypes and maculopapular exanthema (MPE) caused by carbamazepine (CBZ) and oxcarbazepine (OXC), and to explore the genetic risk factors of MPE. Methods Patients who developed MPE (rash group) and patients who did not (non-rash group) after taking CBZ or OXC in the Second Affiliated Hospital of Guangzhou Medical University from January 2016 to October 2021 were retrospectively collected. DNA was extracted from peripheral blood, HLA-A and HLA-B alleles were typed by high-resolution sequencing, and a case-control study was conducted to analyze the correlations between MPE and HLA genotypes. Results A total of 100 patients with CBZ-induced MPE, 100 CBZ-tolerant patients, 50 patients with OXC-induced MPE, and 50 OXC-tolerant patients were collected. There was no significant difference in age or sex between the rash and non-rash groups for either CBZ or OXC. The average latency of the CBZ-rash group was (11.31±11.00) days and the average dosage was (348.46±174.10) mg; the average latency of the OXC-rash group was (11.67±10.34) days and the average dosage was (433.52±209.22) mg [equivalent to (289.01±139.48) mg of CBZ], with no significant difference in latency or dosage between CBZ and OXC (P>0.05). The positive rates of HLA-A*24:02 and A*30:01 in the CBZ-rash group were 28% and 6%, respectively, significantly higher than those in the CBZ-non-rash group (16% and 0%, both P=0.04). The positive rate of HLA-B*40:01 in the CBZ-rash group was 18%, significantly lower than that in the CBZ-non-rash group (40%, P<0.001). No association between HLA-A or HLA-B genotype and OXC-induced rash was found. When the CBZ and OXC groups were pooled, the positive rates of HLA-A*24:02 and A*30:01 in the rash group remained higher than those in the non-rash group, while the positive rate of HLA-B*40:01 remained lower, and the differences were statistically significant (P<0.05). Conclusions HLA-A*24:02 and A*30:01 are associated with MPE caused by CBZ and may be common risk factors for MPE induced by aromatic antiepileptic drugs.
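A case-control comparison of carrier frequencies like the one above reduces to a test on a 2×2 contingency table. The abstract does not state which test was used, so the sketch below assumes Fisher's exact test for illustration, applied to the reported HLA-A*24:02 counts (28/100 carriers in the CBZ-rash group vs 16/100 in the CBZ-tolerant group).

```python
# 2x2 carrier-frequency comparison via Fisher's exact test (assumed test).
from scipy.stats import fisher_exact

carriers_rash, n_rash = 28, 100   # HLA-A*24:02 carriers among CBZ-MPE
carriers_tol, n_tol = 16, 100     # carriers among CBZ-tolerant
table = [[carriers_rash, n_rash - carriers_rash],
         [carriers_tol, n_tol - carriers_tol]]

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, P = {p_value:.3f}")
```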
Objective To investigate the risk factors for lymph node metastasis in cT1N0M0 stage lung squamous cell carcinoma and to develop a logistic regression model to predict lymph node metastasis. Methods A retrospective study was conducted on patients with cT1N0M0 stage lung squamous cell carcinoma treated in our department from August 2017 to October 2022. The correlations of basic clinical data, imaging data, and pathological data with lymph node metastasis were analyzed. Univariate and multivariate logistic regression analyses were employed for risk factor analysis. Receiver operating characteristic curves and the Hosmer-Lemeshow test were used to evaluate the model’s discrimination and calibration, and the bootstrap method with 1 000 resamples was employed for internal validation of the model. Results Central tumor location, tumor differentiation, cytokeratin 19 fragment (CYFRA21-1) level, and tumor size were independent risk factors for lymph node metastasis in cT1N0M0 stage lung squamous cell carcinoma. The optimal cutoff values for tumor size and CYFRA21-1 level were 2.05 cm and 4.20 ng/mL, respectively. The combination of tumor location, CYFRA21-1 level, and tumor size demonstrated superior predictive capability compared with any individual factor. Conclusion Central tumor location, poor differentiation, CYFRA21-1 level, and tumor size are risk factors for lymph node metastasis in cT1N0M0 stage lung squamous cell carcinoma. The combined predictive model has guiding significance for intraoperative lymph node resection strategies in cT1N0M0 stage lung squamous cell carcinoma.
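The bootstrap internal validation mentioned above is often done by optimism correction (Harrell's method): refit the model on each resample, measure how much its AUC drops from the bootstrap sample to the original data, and subtract the mean drop from the apparent AUC. This is a minimal sketch of that procedure on synthetic data; it assumes optimism correction specifically, which the abstract does not spell out.

```python
# Bootstrap optimism-corrected AUC (1 000 resamples), synthetic data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=300, n_features=4, random_state=0)
rng = np.random.default_rng(0)

base = LogisticRegression().fit(X, y)
apparent = roc_auc_score(y, base.predict_proba(X)[:, 1])

optimism = []
for _ in range(1000):
    idx = rng.integers(0, len(y), len(y))   # resample with replacement
    if len(np.unique(y[idx])) < 2:          # skip degenerate resamples
        continue
    m = LogisticRegression().fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_orig)

print(f"apparent AUC = {apparent:.3f}, "
      f"optimism-corrected AUC = {apparent - np.mean(optimism):.3f}")
```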
Objective To analyze and summarize the clinical characteristics of foreign body incarceration in the upper digestive tract, and to explore the risk factors for its complications. Methods The clinical data of patients with foreign bodies in the upper digestive tract treated in the Affiliated Hospital of Zunyi Medical University between January 1, 2012 and December 31, 2021 were retrospectively analyzed, including demographic data, foreign body type, incarceration site, incarceration time, causes, symptoms, treatment methods, and complications of foreign body incarceration. Logistic regression analysis was used to explore the risk factors for complications. Results A total of 721 patients were included, ranging in age from 3 months to 90 years, with an average age of 26.76 years. Patients aged ≤14 years accounted for the highest proportion (51.18%), and an incarceration duration of <12 hours was the most common (55.34%). The most common sharp foreign bodies in the upper digestive tract were animal bones (228 cases), and the most common round foreign bodies were coins (223 cases). The most common impaction site was the upper esophageal segment (85.02%). In total, 105 patients (14.56%) had complications, of which perforation was the most common (5.55%). Logistic regression analysis showed that age [odds ratio (OR)=0.523, 95% confidence interval (CI) (0.312, 0.875), P=0.014], foreign body type [OR=0.520, 95%CI (0.330, 0.820), P=0.005], incarceration site [OR=2.347, 95%CI (1.396, 3.947), P=0.001], and incarceration time [OR=0.464, 95%CI (0.293, 0.736), P=0.001] were influencing factors for complications. Conclusions The majority of foreign bodies in the upper digestive tract are animal bones. The incidence of complications increases in patients aged ≥60 years and in those with sharp-edged foreign bodies, incarceration in the upper segment of the esophagus, or a long incarceration time. It is recommended that sharp foreign bodies incarcerated in the upper esophagus, especially in elderly patients, be removed as soon as possible.
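The odds ratios and confidence intervals reported above come from a fitted logistic model: coefficients and their CIs are estimated on the log-odds scale and then exponentiated. The sketch below shows that step with statsmodels; the predictor names, their 0/1 coding, and the effect sizes are illustrative assumptions, and the data are synthetic.

```python
# From logistic coefficients to odds ratios with 95% CIs (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 721
df = pd.DataFrame({
    "age_ge60": rng.integers(0, 2, n),      # hypothetical: age >= 60 years
    "sharp_body": rng.integers(0, 2, n),    # hypothetical: sharp-edged body
    "upper_esoph": rng.integers(0, 2, n),   # hypothetical: upper esophagus
    "time_ge12h": rng.integers(0, 2, n),    # hypothetical: >= 12 h impaction
})
logit = -2.5 + 0.8 * df["sharp_body"] + 0.9 * df["upper_esoph"]
df["complication"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df.drop(columns="complication"))
fit = sm.Logit(df["complication"], X).fit(disp=0)
ors = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
ors.columns = ["OR", "2.5%", "97.5%"]
print(ors.round(3))
```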
Objective To analyze the prevalence and risk factors of metabolic syndrome (MS) in adult liver transplantation (LT) recipients. Methods The clinicopathologic data of patients who underwent LT in the People’s Hospital of Zhongshan City from January 1, 2015 to August 31, 2020 and survived for ≥1 year were analyzed retrospectively. A logistic regression model was used to analyze the risk factors for MS after LT, and receiver operating characteristic (ROC) curves were used to determine the optimal cutoff values of the predictors of MS and to evaluate their predictive performance. Results A total of 107 patients who met the inclusion criteria were collected. Based on the MS diagnostic criteria of the Diabetes Society of the Chinese Medical Association, the incidence of MS after LT was 32.7% (35/107). Multivariate logistic regression analysis showed that older recipient age [OR (95%CI)=1.106 (1.020, 1.199), P=0.014], higher preoperative body mass index [OR (95%CI)=1.439 (1.106, 1.872), P=0.007], higher preoperative blood glucose level [OR (95%CI)=1.708 (1.317, 2.213), P<0.001], preoperative smoking history [OR (95%CI)=5.814 (1.640, 20.610), P=0.006], and preoperative drinking history [OR (95%CI)=5.390 (1.454, 19.984), P=0.012] increased the probability of MS after LT. The areas under the ROC curve (AUC) for these five indexes were 0.666, 0.669, 0.769, 0.682, and 0.612, respectively, and the optimal cutoff values of the three continuous variables (recipient age, preoperative body mass index, and blood glucose level) were 53 years, 23.1 kg/m2, and 6.8 mmol/L, respectively. The AUC of the combination of the above five indexes in predicting MS was 0.903 [95%CI (0.831, 0.952)], with a sensitivity of 80.0% and a specificity of 90.3%. Conclusions The incidence of MS in adult LT recipients is not low. For recipients who are elderly or obese, or who have preoperative hyperglycemia or histories of drinking and smoking before LT, attention should be paid to early detection of and early intervention for MS.
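Optimal cutoff values for continuous predictors, like those reported above, are commonly derived from the ROC curve by maximizing the Youden index; the abstract does not name the criterion, so that choice is an assumption in the sketch below. Synthetic blood glucose values stand in for the study's measurements.

```python
# Optimal cutoff of a continuous predictor via the Youden index.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(2)
ms = rng.integers(0, 2, 107)                       # MS after LT (0/1)
glucose = np.where(ms, rng.normal(7.5, 1.5, 107),  # hypothetical mmol/L
                   rng.normal(5.8, 1.0, 107))

fpr, tpr, thresholds = roc_curve(ms, glucose)
j = tpr - fpr                                      # Youden index per cutoff
best = np.argmax(j)
print(f"AUC = {roc_auc_score(ms, glucose):.3f}, "
      f"optimal cutoff = {thresholds[best]:.1f} mmol/L "
      f"(Se = {tpr[best]:.2f}, Sp = {1 - fpr[best]:.2f})")
```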
Objective To understand the impact of preoperative nutritional status on postoperative complications in patients with low/ultra-low rectal cancer undergoing extreme sphincter-preserving surgery following neoadjuvant therapy. Methods The patients with low/ultra-low rectal cancer who underwent extreme sphincter-preserving surgery following neoadjuvant therapy from January 2009 to December 2020 were retrospectively collected from the Database from Colorectal Cancer (DACCA), and were assigned to a nutritional risk group (Nutrition Risk Screening 2002 score ≥3) or a non-nutritional risk group (Nutrition Risk Screening 2002 score <3). The postoperative complications and survival were analyzed for the patients with and without nutritional risk. The postoperative complications were classified as early-term (occurring within 30 d after surgery), middle-term (occurring 30–180 d after surgery), and long-term (occurring at 180 d or more after surgery). The survival indicators included overall survival and disease-specific survival. Results A total of 680 patients who met the inclusion criteria were retrieved from the DACCA database, including 500 (73.5%) patients without nutritional risk and 180 (26.5%) patients with nutritional risk. The postoperative follow-up time was 0–152 months (average 48.9 months). Five hundred and forty-three patients survived, including 471 (86.7%) with tumor-free survival and 72 (13.3%) surviving with tumor. There were 137 deaths, including 122 (89.1%) cancer-related deaths and 15 (10.9%) non-cancer-related deaths. There were 48 (7.1%) cases of early-term postoperative complications, 51 (7.5%) cases of middle-term complications, and 17 (2.5%) cases of long-term complications. There were no statistically significant differences in the incidence of overall complications at any stage between the patients with and without nutritional risk (χ2=3.749, P=0.053; χ2=2.205, P=0.138; χ2=0.310, P=0.578). The specific complications at different stages after surgery also showed no statistically significant differences between the two groups (P>0.05), except anastomotic leakage, the incidence of which was higher in the patients with nutritional risk (P=0.034). The survival curves (overall survival and disease-specific survival) estimated by the Kaplan-Meier method showed no statistically significant differences between the patients with and without nutritional risk (χ2=3.316, P=0.069; χ2=3.712, P=0.054). Conclusions The results of this study suggest that, among rectal cancer patients undergoing extreme sphincter-preserving surgery following neoadjuvant therapy, those with preoperative nutritional risk are more prone to anastomotic leakage within 30 d after surgery. Although other postoperative complications and long-term survival outcomes show no statistically significant differences between patients with and without nutritional risk, preoperative nutritional management should not be ignored.
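The survival comparison described above pairs Kaplan-Meier curves with a log-rank test. This is a minimal sketch of that analysis using the lifelines package; the follow-up times and event indicators are synthetic stand-ins generated to match only the group sizes and follow-up range in the abstract, not its results.

```python
# Kaplan-Meier curves and log-rank test for two groups (synthetic data).
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
t_risk = rng.exponential(80, 180).clip(max=152)    # months, at-risk group
t_norisk = rng.exponential(100, 500).clip(max=152)
e_risk = rng.binomial(1, 0.3, 180)                 # 1 = death observed
e_norisk = rng.binomial(1, 0.25, 500)

kmf = KaplanMeierFitter()
kmf.fit(t_risk, e_risk, label="nutritional risk")
ax = kmf.plot_survival_function()
kmf.fit(t_norisk, e_norisk, label="no nutritional risk")
kmf.plot_survival_function(ax=ax)

res = logrank_test(t_risk, t_norisk,
                   event_observed_A=e_risk, event_observed_B=e_norisk)
print(f"log-rank chi2 = {res.test_statistic:.3f}, P = {res.p_value:.3f}")
```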