“Veins First” Versus “Artery First” Approach for Management of Mixed Arterial Venous Leg Ulcers (MAVLU): Systematic Review and Meta-Analysis

Phlebology (2024)

Discipline of Surgery | Faculty of Medicine

Abstract
Introduction: Mixed Arterial and Venous Leg Ulcers (MAVLU) are challenging to manage, and the optimal sequence of intervention (artery-first vs veins-first) is unclear. This review evaluates the current evidence on the sequencing of surgical intervention.

Methods: MEDLINE, PUBMED, SCOPUS and EMBASE were searched using the term ‘mixed arterial venous leg ulcers’. Studies were eligible if they reported ulcer healing outcomes in MAVLU patients. Pooled proportions were calculated by random-effects modelling.

Results: The search yielded 606 studies, eight of which contained sufficient data for inclusion in the analysis. There were no randomized controlled trials. Initial modified compression therapy (MCT) with rescue revascularisation in MAVLU patients with ABI 0.5 to 0.85 achieved a pooled healing rate of 75% (95% CI 69% to 80%), compared with 79% (95% CI 61% to 93%) in patients with standard venous leg ulcers (VLUs). The pooled rescue revascularisation rate for MAVLU patients with moderate arterial disease was 25% (95% CI 6% to 51%). Patients with severe arterial disease (ABI <0.5) who underwent arterial intervention first were less likely to heal (pooled proportion 40%; 95% CI 16% to 66%). No studies compared either MCT or venous ablation with arterial revascularisation as first-line treatment in patients with moderate arterial disease (ABI 0.5 to 0.85) alone or severe arterial disease (ABI <0.5) alone. There was marked heterogeneity between studies in the ulcer healing outcomes reported, definitions of ulcer healing, duration and size of ulcers at presentation, use of adjunct procedures such as skin grafting, unit of measurement (legs vs patients) and duration of follow-up.

Conclusion: A ‘veins-first’ approach to MAVLU is plausible, but robust data are lacking and the approach should be evaluated in a randomized controlled trial.
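The abstract states only that pooled proportions were calculated by random-effects modelling, without implementation details. The sketch below illustrates one common way such pooling is done (logit-transformed proportions combined with a DerSimonian-Laird random-effects model); the transformation choice, the function name and the study counts are assumptions for illustration and are not taken from the review, whose study-level data are not shown here.

```python
import numpy as np
from scipy.special import logit, expit
from scipy.stats import norm

def pool_proportions_dl(events, totals, alpha=0.05):
    """DerSimonian-Laird random-effects pooling of proportions on the logit scale.

    events/totals: per-study counts of healed ulcers and ulcers analysed
    (hypothetical figures; not the review's data).
    Returns the pooled proportion and its (1 - alpha) confidence interval.
    """
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)

    # Study-level effects and variances on the logit scale.
    y = logit(events / totals)
    v = 1.0 / events + 1.0 / (totals - events)

    # Fixed-effect weights, Q statistic and between-study variance tau^2.
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)

    # Random-effects weights, pooled estimate and standard error.
    w_re = 1.0 / (v + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))

    # Back-transform the pooled logit and its confidence limits to proportions.
    z = norm.ppf(1 - alpha / 2)
    ci = expit(np.array([y_re - z * se_re, y_re + z * se_re]))
    return expit(y_re), ci

# Hypothetical example: healed/total ulcers in four studies.
pooled, (lo, hi) = pool_proportions_dl(events=[30, 45, 18, 60], totals=[40, 60, 25, 80])
print(f"Pooled healing rate: {pooled:.0%} (95% CI {lo:.0%} to {hi:.0%})")
```

Note that meta-analyses of proportions often use a Freeman-Tukey double-arcsine transformation instead of the logit shown here; the abstract does not say which transformation the authors used, and zero or complete event counts would require a continuity correction before the logit step.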