ResearchPad - cardiovascular-procedures https://www.researchpad.co

<![CDATA[The adipokine vaspin is associated with decreased coronary in-stent restenosis <i>in vivo</i> and inhibits migration of human coronary smooth muscle cells <i>in vitro</i>]]> https://www.researchpad.co/article/elastic_article_7692

Percutaneous coronary intervention (PCI) represents the most important treatment modality for coronary artery stenosis. In-stent restenosis (ISR) still limits long-term outcomes despite the introduction of drug-eluting stents. Adipokines have been shown to directly influence vessel wall homeostasis by modulating the function of endothelial cells and arterial smooth muscle cells. The visceral adipose tissue-derived serpin vaspin was recently identified as a member of the serine protease inhibitor family, and several studies have demonstrated its relation to metabolic diseases. The aim of this study was to investigate the role of vaspin in the development of in-stent restenosis in vivo and in the migration of smooth muscle cells and endothelial cells in vitro.

Methods

We studied 85 patients with stable coronary artery disease who underwent elective and successful PCI with implantation of drug-eluting stents. Blood samples were taken directly before PCI, and vaspin plasma levels were measured by a specific ELISA. ISR was evaluated eight months later by coronary angiography. Migration of human coronary artery smooth muscle cells (HCASMC) and human umbilical vein endothelial cells (HUVEC) was analyzed by an in vitro migration assay with different concentrations of vaspin (0.004 ng/mL up to 40 ng/mL) as well as by a scratch assay. Proliferation was assessed by impedance measurement using specialized E-plates.

Results

During the follow-up period, 14 patients developed ISR. Patients with ISR had significantly lower vaspin plasma levels than patients without ISR (0.213 ng/mL vs. 0.382 ng/mL; p = 0.001). In patients with plasma vaspin levels above 1.35 ng/mL, we did not observe any restenosis. Plasma vaspin levels also correlated significantly with late lumen loss in the stented coronary segments. Furthermore, vaspin nearly abolished serum-induced migration of HCASMC (100% vs. 9%; p < 0.001) in a biphasic manner but did not affect migration of HUVEC. Proliferation of HCASMC and HUVEC was not modulated by vaspin treatment.

Conclusion

We were able to show that the adipokine vaspin selectively inhibits human coronary SMC migration in vitro and has no effect on HUVEC migration. Vaspin also had no effect on proliferation of HUVEC, an important process in the healing of the stented vessel. In addition, the occurrence of ISR after PCI with implantation of drug-eluting stents was significantly associated with low vaspin plasma levels before intervention. Determination of vaspin plasma levels before PCI might therefore help identify patients at high risk of developing ISR after stent implantation. Moreover, the selective effect of vaspin on smooth muscle cell migration could potentially be exploited to reduce ISR without inhibiting re-endothelialization of the stented segment.

]]>
<![CDATA[β-blockers after acute myocardial infarction in patients with chronic obstructive pulmonary disease: A nationwide population-based observational study]]> https://www.researchpad.co/article/5c8823d2d5eed0c4846390b1

Background

Patients with chronic obstructive pulmonary disease (COPD) less often receive β-blockers after acute myocardial infarction (AMI), which may influence their outcomes after AMI. This study evaluated the efficacy of β-blockers after AMI in patients with COPD, compared with non-dihydropyridine calcium channel blockers (NDCCBs) and with the absence of both kinds of treatment.

Methods and results

We conducted a nationwide population-based cohort study using data retrieved from the Taiwan National Health Insurance Research Database. We identified 28,097 patients with COPD who were hospitalized for AMI between January 2004 and December 2013. After hospital discharge, 24,056 patients returned to outpatient clinics within 14 days (the exposure window). Those who received both β-blockers and NDCCBs (n = 302) were excluded, leaving 23,754 patients for analysis. Patients were classified into the β-blocker group (n = 10,638, 44.8%), the NDCCB group (n = 1,747, 7.4%), and the control group (n = 11,369, 47.9%) based on their outpatient prescriptions within the exposure window. Patients in the β-blocker group had lower overall mortality risks (adjusted hazard ratio [95% confidence interval]: 0.91 [0.83–0.99] versus the NDCCB group; 0.88 [0.84–0.93] versus the control group), but the risk of major adverse cardiac events within 1 year was not statistically different. β-blockers decreased the risk of re-hospitalization for COPD and other respiratory diseases by 12–32%.
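
A minimal sketch of how such an adjusted hazard ratio can be estimated from a one-row-per-patient table, using the lifelines library; the file name, column names, and covariate set are illustrative assumptions, not the study's actual pipeline.

```python
# Hedged sketch: adjusted hazard ratio for beta-blocker use versus control
# with a Cox proportional hazards model. Column names are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ami_copd_cohort.csv")  # hypothetical cohort extract

cph = CoxPHFitter()
cph.fit(
    df[["followup_days", "died", "beta_blocker", "age", "male", "diabetes"]],
    duration_col="followup_days",
    event_col="died",
)
cph.print_summary()  # exp(coef) of beta_blocker is the adjusted HR (cf. 0.88)
```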

Conclusions

The use of β-blockers after AMI was associated with a reduced mortality risk in patients with COPD. β-blockers did not increase the risk of COPD exacerbations.

]]>
<![CDATA[An anatomical study on lumbar arteries related to the extrapedicular approach applied during lumbar PVP (PKP)]]> https://www.researchpad.co/article/5c8823e7d5eed0c48463929f

To examine the regional anatomy of the lumbar artery (LA) relevant to the extrapedicular approach used during percutaneous vertebroplasty (PVP) and percutaneous kyphoplasty (PKP), we collected 78 samples of abdominal computed tomography angiography imaging data. We measured the nearest distance from the center of the vertebral body puncture point to the LA (distance VBPP-LA, DVBPP-LA). According to the DVBPP-LA, four zones (Zones I–IV) were identified, and LAs that passed through these zones were called Type I, Type II, Type III and Type IV LAs, respectively. A portion of the lumbar vertebrae had an intersegmental branch that originated from the upper segmental LA and extended longitudinally across the lateral wall of the pedicle; this was called a Type V LA. The DVBPP-LA differed significantly among L1, L2, L3 and L4, both overall and between groups (P < 0.05). In L1, L2, L3, L4 and L5 there were 8, 4, 4, 0 and 1 Type I LAs, respectively. There were no Type V LAs in L1 or L2, but there were 2, 16 and 26 Type V LAs in L3, L4 and L5, respectively. Across L1–L5, the combined numbers of Type I and Type V LAs were 8, 4, 6, 16 and 27, with presence ratios of 5.1%, 2.6%, 5.6%, 10.3% and 17.3%, respectively. In L4 and L5, the presence ratios of Type I plus Type V LAs were 7.1% and 10.7% in men and 13.9% and 25.0% in women, respectively. Thus, extrapedicular PVP (PKP) in the lumbar vertebrae carries a risk of LA injury and is not recommended in L4 and L5, especially in female patients.
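
The zone classification above is essentially a binning of the measured distance; a small sketch follows. The abstract does not give the zone boundaries, so the `boundaries` default below is a placeholder assumption, not the paper's cut-offs.

```python
# Hedged sketch: bin a DVBPP-LA distance into Zones I-IV. The boundary
# values are placeholders; the paper's actual cut-offs are not stated here.
from bisect import bisect_left

def classify_zone(dvbpp_la_mm: float, boundaries=(2.0, 4.0, 6.0)) -> str:
    """Return 'I'..'IV' for a distance (mm) given ascending zone boundaries."""
    zones = ("I", "II", "III", "IV")
    return zones[bisect_left(boundaries, dvbpp_la_mm)]

print(classify_zone(1.2))  # -> 'I' under the placeholder boundaries
```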

]]>
<![CDATA[Shrinkage of hepatocellular carcinoma after radiofrequency ablation following transcatheter arterial chemoembolization: Analysis of contributing factors]]> https://www.researchpad.co/article/5c818e88d5eed0c484cc24bb

Objective

This study investigated tumor shrinkage, and the factors influencing it, in patients with hepatocellular carcinoma (HCC) undergoing radiofrequency (RF) ablation following transcatheter arterial chemoembolization (TACE).

Methods

A total of 222 patients underwent combined sequential treatment with TACE and RF ablation for HCC at our institution between 2008 and 2014. Of those, 86 patients (68 men, 18 women) who achieved compact iodized oil tagging and complete ablation were included in this retrospective study. We measured three-dimensional tumor diameters and calculated tumor volumes on pre-treatment CT/MRI and on follow-up CT scans performed post-TACE, post-ablation, and 1 month post-treatment. Periodically generated tumor diameters and volumes were compared using repeated-measures analysis of variance (ANOVA), and multiple linear regression analysis was performed to identify factors impacting tumor shrinkage after RF ablation.
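
The abstract does not state how volumes were derived from the three diameters; a common convention is the ellipsoid approximation, sketched below as an assumption.

```python
# Hedged sketch: ellipsoid approximation V = (pi/6) * d1 * d2 * d3 for turning
# three orthogonal diameters into a volume, plus a percent reduction rate.
import math

def ellipsoid_volume(d1_mm: float, d2_mm: float, d3_mm: float) -> float:
    """Tumor volume in mm^3 from three orthogonal diameters in mm."""
    return math.pi / 6.0 * d1_mm * d2_mm * d3_mm

def reduction_rate(v_before: float, v_after: float) -> float:
    """Percent volume reduction between two scans."""
    return 100.0 * (v_before - v_after) / v_before

v_post_tace = ellipsoid_volume(30, 28, 25)      # illustrative diameters
v_post_ablation = ellipsoid_volume(25, 23, 20)  # roughly 18% smaller per axis
print(f"{reduction_rate(v_post_tace, v_post_ablation):.1f}% volume reduction")
```

Note how an ~18% per-axis shrinkage translates into a ~45% volume reduction, in line with the paired figures reported below.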

Results

Diameters and volumes of HCCs declined significantly in the immediate aftermath of RF ablation (i.e., between the post-TACE and post-ablation CT scans) (p < 0.001 for both). Mean reduction rates in tumor diameter and volume immediately after RF ablation were 18.2 ± 9.1% and 44.4 ± 14.6%, respectively, consistent with volume scaling roughly as the cube of diameter (1 − (1 − 0.182)³ ≈ 0.45). Of note, tumors in the left hepatic lobe and in subphrenic or perivascular locations showed lower rates of post-ablative volume reduction than those in counterpart locations (p = 0.002, 0.046, and 0.024, respectively). Tumor size and liver function did not influence tumor shrinkage after RF ablation.

Conclusion

In patients with HCC, significant tumor shrinkage occurs immediately after RF ablation. The degree of shrinkage in response to ablative treatment seems to vary by tumor location.

]]>
<![CDATA[Apolipoprotein B correlates with intra-plaque necrotic core volume in stable coronary artery disease]]> https://www.researchpad.co/article/5c75ac91d5eed0c484d08a4e

Objective

To determine the relationship between plaque composition and low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol (HDL-C), apolipoprotein B (Apo-B), and Apo-A1 using virtual-histology intravascular ultrasound (VH-IVUS).

Methods

We assessed plaque composition in patients with stable coronary artery disease (SCD) admitted to our hospital for percutaneous coronary intervention (PCI) between November 1, 2012, and March 10, 2015. Before PCI, fibrous (FI), fibrofatty (FF), necrotic core (NC), and dense calcium (DC) regions were evaluated using VH-IVUS, and the contributions of each to the culprit lesion volume were recorded. Plasma LDL-C, HDL-C, Apo-B, and Apo-A1 levels were assessed before PCI. The relationship between the regions on VH-IVUS and plasma lipid levels was assessed. Patients were categorized into low Apo-B (LAB) and high Apo-B (HAB) groups, based on the overall cohort median Apo-B level.
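
A minimal sketch of the grouping and correlation analysis just described, assuming a one-row-per-patient table; the file and column names are illustrative, and the paper's actual split (LAB ≤ 90 mg/dL vs. HAB ≥ 91 mg/dL) is approximated here by a median split.

```python
# Hedged sketch: median split into LAB/HAB groups and Pearson correlation of
# %NC volume with Apo-B. Column names are assumptions.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("vh_ivus_lesions.csv")    # hypothetical one-row-per-patient
median_apob = df["apo_b_mg_dl"].median()   # reported cohort median: 91 mg/dL
df["group"] = df["apo_b_mg_dl"].apply(
    lambda x: "HAB" if x > median_apob else "LAB"
)

r, p = pearsonr(df["apo_b_mg_dl"], df["pct_nc_volume"])
print(f"%NC volume vs Apo-B: r = {r:.4f}, p = {p:.4f}")  # reported: r = 0.2487
```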

Results

We enrolled 115 patients (median Apo-B, 91 mg/dL; 88 male), with 57 and 58 patients in the LAB (Apo-B ≤ 90 mg/dL) and HAB (Apo-B ≥ 91 mg/dL) groups, respectively. Vessel, plaque, and %NC volumes were significantly greater in the HAB group than in the LAB group. The %FI, %FF, and %DC volumes were similar in both groups. In all 115 patients, the %NC volume correlated with LDL-C (r = 0.2353, P = 0.0114) and Apo-B (r = 0.2487, P = 0.0074) but not with HDL-C or Apo-A1. The high-sensitivity C-reactive protein level tended to be higher in the HAB group than in the LAB group. Multiple regression analysis showed that male sex, Apo-A1, and Apo-B were significant predictors of the extent of %NC volume.

Conclusions

Elevated Apo-B level was related to the %NC in target coronary artery lesions in SCD patients, suggesting a role of Apo-B as a biomarker of unstable plaque in this population.

]]>
<![CDATA[Treatment with mononuclear cell populations improves post-infarction cardiac function but does not reduce arrhythmia susceptibility]]> https://www.researchpad.co/article/5c6f1496d5eed0c48467a361

Background

Clinical and experimental data provide evidence that transplantation of stem and progenitor cells in myocardial infarction could be beneficial, although the underlying mechanism has remained elusive. Ventricular tachyarrhythmia is the most frequent and potentially lethal complication of myocardial infarction, but the impact of mononuclear cells on the incidence of ventricular arrhythmia is still not clear.

Objective

We aimed to characterize the influence of splenic mononuclear cell populations on ventricular arrhythmia after myocardial infarction.

Methods

We assessed electrical vulnerability in vivo in mice with left ventricular cryoinfarction 14 days after injury and intramyocardial injection of specific subpopulations of mononuclear cells (MNCs): CD11b-positive cells, Sca-1-positive cells, and early endothelial progenitor cells (eEPCs). Embryonic cardiomyocytes (eCMs) served as the positive control group. Epicardial mapping was performed to analyse conduction velocities in the border zone. Left ventricular function was quantified by echocardiography and left heart catheterization.

Results

In vivo pacing protocols induced ventricular tachycardia (VT) in 30% of non-infarcted mice. In contrast, monomorphic or polymorphic VT could be evoked in 94% of infarcted, vehicle-injected mice (p < 0.01). Only transplantation of eCMs prevented post-infarction VT and improved conduction velocities in the border zone, in accordance with increased expression of connexin 43. Cryoinfarction broadly impaired left ventricular function; all transplanted cell types augmented left ventricular function to a similar extent.

Conclusions

Transplantation of different MNC populations after myocardial infarction improves left ventricular function to an extent similar to that achieved with eCMs. Prevention of inducible ventricular arrhythmia, however, is only seen after transplantation of eCMs.

]]>
<![CDATA[Development of a risk score for predicting the benefit versus harm of extending dual antiplatelet therapy beyond 6 months following percutaneous coronary intervention for stable coronary artery disease]]> https://www.researchpad.co/article/5c6f1498d5eed0c48467a3af

Background

Decisions on dual antiplatelet therapy (DAPT) duration should balance the opposing risks of ischaemia and bleeding. Our aim was to develop a risk score to identify stable coronary artery disease (SCAD) patients undergoing PCI who would benefit or suffer from extending DAPT beyond 6 months.

Methods

We retrospectively analyzed a cohort of patients who completed 6 months of DAPT following PCI. Predictors of ischaemic and bleeding events in the 6–12-month period post-PCI were identified, and a risk score was developed to estimate the likelihood of benefiting from extending DAPT beyond 6 months. The incidence of mortality and of ischaemic and bleeding events in patients treated with DAPT for 6 vs. 6–12 months was compared across strata of the risk score.
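
One common way to turn such event predictors into a bedside score is to fit a regression model and round scaled coefficients to integer points; a sketch follows, using the predictors reported in the Results. The column names, file name, and scaling rule are illustrative assumptions, not the authors' actual derivation.

```python
# Hedged sketch: deriving an integer risk score from logistic-regression
# coefficients. Predictor names follow the abstract; the rest is assumed.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("dapt_cohort.csv")  # hypothetical one-row-per-patient table
predictors = ["hf_or_lvef_le30", "lm_or_3v_cad", "prior_pci", "prior_stroke"]

model = LogisticRegression().fit(df[predictors], df["ischaemic_event"])

# Scale coefficients by the smallest one and round to whole points.
points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)
print(dict(zip(predictors, points)))                   # the score card
df["risk_score"] = df[predictors].to_numpy() @ points  # per-patient score
```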

Results

The study included 2,699 patients. During the 6–12-month follow-up period there were 78 (2.9%) ischaemic and 43 (1.6%) bleeding events. Four variables (heart failure or left ventricular ejection fraction ≤30%, left main or three-vessel CAD, status post (s/p) PCI, and s/p stroke) predicted ischaemic events; two variables (age >75 years, haemoglobin <10 g/dL) predicted bleeding. In the lower stratum of the risk score, 6–12 months of treatment with DAPT resulted in increased bleeding (p = 0.045) with no decrease in ischaemic events. In the upper stratum, 6–12 months of DAPT was associated with reduced ischaemic events (p = 0.029), with no increase in bleeding.

Conclusion

In a population of SCAD patients who completed 6 months of DAPT, a risk score for subsequent ischaemic and bleeding events identified patients likely to benefit from continuing or stopping DAPT.

]]>
<![CDATA[Hierarchical patient-centric caregiver network method for clinical outcomes study]]> https://www.researchpad.co/article/5c6dca0dd5eed0c48452a709

In clinical outcome studies, analysis has traditionally been performed using patient-level factors, with little attention given to provider-level features. However, the nature of care coordination and collaboration between caregivers (providers) may also be important in determining patient outcomes. Using data from patients admitted to intensive care units at a large tertiary care hospital, we modeled the caregivers who provided medical service to a specific patient as a patient-centric subnetwork embedded within the larger caregiver networks of the institution. The caregiver networks were composed of caregivers who treated either a cohort of patients with a particular disease or any patient regardless of disease. Our model can generate patient-specific caregiver network features at multiple levels, and we demonstrate that these multilevel network features, in addition to patient-level features, are significant predictors of length of hospital stay and in-hospital mortality.
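
A minimal sketch of the core idea, assuming caregivers are nodes and edges link caregivers who co-treated patients; the edge construction and the particular features (size, density, institutional degree) are illustrative assumptions, not the paper's exact feature set.

```python
# Hedged sketch: a patient-centric caregiver subnetwork and simple features.
import networkx as nx

# Institution-level network: nodes are caregivers; an edge means the two
# caregivers co-treated at least one patient (toy edges below).
G = nx.Graph()
G.add_edges_from([("dr_a", "nurse_b"), ("dr_a", "dr_c"), ("nurse_b", "rt_d"),
                  ("dr_c", "rt_d"), ("dr_e", "dr_c")])

def patient_subnetwork_features(G: nx.Graph, caregivers: set) -> dict:
    """Features of the subnetwork induced by one patient's caregivers."""
    S = G.subgraph(caregivers)
    return {
        "n_caregivers": S.number_of_nodes(),
        "density": nx.density(S),
        "mean_degree_in_G": sum(G.degree(c) for c in caregivers) / len(caregivers),
    }

# Features for a patient treated by dr_a, nurse_b and rt_d; these would then
# enter an outcome model alongside patient-level covariates.
print(patient_subnetwork_features(G, {"dr_a", "nurse_b", "rt_d"}))
```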

]]>
<![CDATA[The association of preoperative cardiac stress testing with 30-day death and myocardial infarction among patients undergoing kidney transplantation]]> https://www.researchpad.co/article/5c5df353d5eed0c48458115b

Background

Although periodic cardiac stress testing is commonly used to screen patients on the waiting list for kidney transplantation for ischemic heart disease, there is little evidence to support this practice. We hypothesized that cardiac stress testing in the 18 months prior to kidney transplantation would not reduce postoperative death, total myocardial infarction (MI) or fatal MI.

Methods

Using the United States Renal Data System, we identified ESRD patients ≥40 years old with primary Medicare insurance who received their first kidney transplant between 7/1/2006 and 11/31/2013. Propensity matching created a 1:1 matched sample of patients with and without stress testing in the 18 months prior to kidney transplantation. The outcomes of interest were death, total (fatal and nonfatal) MI or fatal MI within 30 days of kidney transplantation.
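
A minimal sketch of 1:1 propensity matching of stress-tested to non-tested patients, using greedy nearest-neighbor matching on the propensity score; the covariates, file name, and matching rule are illustrative assumptions, not necessarily the authors' procedure.

```python
# Hedged sketch: 1:1 greedy nearest-neighbor propensity-score matching
# without replacement. Covariate and column names are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("usrds_transplant_cohort.csv")  # hypothetical extract
covariates = ["age", "diabetes", "dialysis_years", "prior_cad"]

ps_model = LogisticRegression().fit(df[covariates], df["stress_test"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

treated = df[df["stress_test"] == 1]
control = df[df["stress_test"] == 0].copy()

pairs = []
for idx, row in treated.iterrows():
    j = (control["ps"] - row["ps"]).abs().idxmin()  # closest remaining control
    pairs.append((idx, j))
    control = control.drop(index=j)                 # match without replacement

print(f"{len(pairs)} matched pairs")  # cf. 8,652 pairs in the study
```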

Results

In the propensity-matched cohort of 17,304 patients, death within 30 days occurred in 72 of 8,652 (0.83%) patients who underwent stress testing and in 65 of 8,652 (0.75%) patients who did not (OR 1.07; 95% CI: 0.79–1.45; P = 0.66). MI within 30 days occurred in 339 (3.9%) patients who had a stress test and in 333 (3.8%) patients who did not (OR 1.03; 95% CI: 0.89–1.21; P = 0.68). Fatal MI occurred in 17 (0.20%) patients who underwent stress testing and 15 (0.17%) patients who did not (OR 0.97; 95% CI: 0.71–1.32; P = 0.84).

Conclusion

Stress testing in the 18 months prior to kidney transplantation is not associated with a reduction in death, total MI or fatal MI within 30 days of kidney transplantation.

]]>
<![CDATA[Pilot study of myocardial ischemia-induced metabolomic changes in emergency department patients undergoing stress testing]]> https://www.researchpad.co/article/5c5df30dd5eed0c484580c11

Background

The heart is a metabolically active organ, and plasma acylcarnitines are associated with long-term risk for myocardial infarction. We hypothesized that myocardial ischemia induced by cardiac stress testing would produce dynamic changes in acylcarnitine and amino acid levels compared with the levels seen in matched control patients with normal stress tests.

Methods

We analyzed targeted metabolomic profiles in a pilot study of 20 case patients with inducible ischemia on stress testing, drawn from an existing prospectively collected repository of 357 consecutive patients presenting with symptoms of acute coronary syndrome (ACS) to an emergency department (ED) observation unit between November 2012 and September 2014. We selected 20 controls matched on age, sex, and body-mass index (BMI). A peripheral blood sample was drawn <1 hour before and 2 hours after stress testing from each patient. We assayed 60 select acylcarnitines and amino acids by tandem mass spectrometry (MS/MS) using a Quattro Micro instrument (Waters Corporation, Milford, MA). Metabolite values were log-transformed to reduce skew. We then performed bivariable analysis of stress test outcome against both individual time-point metabolite concentrations and stress-delta metabolite ratios (T2/T0). False discovery rates (FDR) were calculated for the 60 metabolites while controlling for age, sex, and BMI. We built multivariable regularized linear models to predict stress test outcome from metabolomics data at time 0, at 2 hours, and from the log ratio between the two, and used leave-one-out cross-validation to estimate the performance characteristics of each model.
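
A minimal sketch of the modeling step just described, assuming per-subject metabolite tables at T0 and T2; the file layout, L2 penalty, and regularization strength are illustrative assumptions.

```python
# Hedged sketch: leave-one-out cross-validated, regularized logistic model
# on log(T2/T0) stress-delta metabolite features.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict

t0 = pd.read_csv("metabolites_t0.csv", index_col="subject")  # hypothetical
t2 = pd.read_csv("metabolites_t2.csv", index_col="subject")
y = pd.read_csv("stress_outcome.csv", index_col="subject")["positive"]

X = np.log(t2 / t0)  # stress-delta features: log(T2/T0) per metabolite

model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
proba = cross_val_predict(model, X, y, cv=LeaveOneOut(), method="predict_proba")
print(f"LOOCV AUC: {roc_auc_score(y, proba[:, 1]):.3f}")  # reported: 0.625
```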

Results

Nine of our 20 case subjects were male. Cases' average age was 55.8 years, and their average BMI was 29.5. Bivariable analysis identified 5 metabolites associated with positive stress tests (FDR < 0.2): alanine, C14:1-OH, C16:1, C18:2, and C20:4. The multivariable regularized linear models built on T0 and T2 data had an area under the ROC curve (AUC-ROC) between 0.5 and 0.55; the log(T2/T0) model, however, yielded an AUC of 0.625, with 65% sensitivity and 60% specificity. The top metabolites selected by the model were Ala, Arg, C12-OH/C10-DC, C14:1-OH, C16:1, C18:2, C18:1, C20:4, and C18:1-DC.

Conclusions

Stress-delta metabolite analysis of patients undergoing stress testing is feasible. Future studies with a larger sample size are warranted.

]]>
<![CDATA[Predictors of long-term prognosis in acute kidney injury survivors who require continuous renal replacement therapy after cardiovascular surgery]]> https://www.researchpad.co/article/5c5ca2cdd5eed0c48441eb4f

The long-term prognosis of patients with postoperative acute kidney injury (AKI) requiring continuous renal replacement therapy (CRRT) after cardiovascular surgery is unclear. We aimed to investigate long-term renal outcomes and survival in these patients and to determine the risk factors for negative outcomes. Long-term prognosis was examined in 144 hospital survivors, all of whom were independent of renal replacement therapy at hospital discharge. The median age at operation was 72.0 years, and the median pre-operative estimated glomerular filtration rate (eGFR) was 39.5 mL/min/1.73 m2. The median follow-up duration was 1,075 days. The endpoints were death, chronic maintenance dialysis dependence, and a composite of death and chronic dialysis. Predictors of death and dialysis were evaluated using Fine and Gray's competing risk analysis. The cumulative incidence of death was 34.9%, and the chronic dialysis rate was 13.3% during the observation period. In the multivariate proportional hazards analysis, eGFR <30 mL/min/1.73 m2 at discharge was associated with the composite endpoint of death and dialysis (hazard ratio [HR], 2.1; 95% confidence interval [CI], 1.1–3.8; P = 0.02). Hypertension (HR 8.7; 95% CI 2.2–35.4; P = 0.002) and eGFR <30 mL/min/1.73 m2 at discharge (HR 26.4; 95% CI 2.6–267.1; P = 0.006) were associated with dialysis. Advanced age (≥75 years) was predictive of death. Patients with severe CRRT-requiring AKI after cardiovascular surgery have increased risks of chronic dialysis and death. Patients with eGFR <30 mL/min/1.73 m2 at discharge should be monitored especially carefully by nephrologists, given their risk of chronic dialysis and death.
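
For context (a standard statement of the method named above, not taken from the paper): the Fine and Gray model acts on the subdistribution hazard derived from each endpoint's cumulative incidence function, keeping subjects who experience the competing event in the risk set.

```latex
% Cumulative incidence of cause k (e.g., chronic dialysis) and its
% subdistribution hazard; covariates Z act proportionally on \lambda_k.
F_k(t) = \Pr(T \le t,\ \text{cause} = k), \qquad
\lambda_k(t) = -\frac{\mathrm{d}}{\mathrm{d}t} \log\bigl(1 - F_k(t)\bigr), \qquad
\lambda_k(t \mid Z) = \lambda_{k,0}(t)\, e^{\beta^{\top} Z}
```

Under this model, the reported HRs (e.g., 26.4 for discharge eGFR <30 mL/min/1.73 m2) describe effects on the subdistribution hazard of dialysis rather than on a cause-specific hazard.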

]]>
<![CDATA[Disparities in outcomes of patients admitted with diabetic foot infections]]> https://www.researchpad.co/article/5c61e8bfd5eed0c48496f09c

Objective

The purpose of this study was to evaluate disparities in the outcomes of White, African American (AA), and non-AA minority (Hispanic and Native American (NA)) patients admitted to hospitals with diabetic foot infections (DFIs).

Research design and methods

The HCUP Nationwide Inpatient Sample (2002 to 2015) was queried using ICD-9 codes to identify patients admitted to the hospital for management of DFI. Outcomes evaluated included minor and major amputations, open or endovascular revascularization, and hospital length of stay (LOS). The incidences of amputation and of open or endovascular revascularization were evaluated over the study period. Multivariable regression analyses were performed to assess the association between race/ethnicity and outcomes.
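
A minimal sketch of one such multivariable analysis (adjusted odds of major amputation by race/ethnicity with White as the reference), assuming an admission-level table; the variable names and covariate set are illustrative assumptions.

```python
# Hedged sketch: adjusted odds ratios for major amputation by race/ethnicity.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

nis = pd.read_csv("nis_dfi_admissions.csv")  # hypothetical NIS extract

fit = smf.logit(
    "major_amputation ~ C(race, Treatment('White')) + age + female",
    data=nis,
).fit()

print(np.exp(fit.params))      # odds ratios, e.g. ~1.4 for AA vs. White
print(np.exp(fit.conf_int()))  # 95% CIs on the OR scale
```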

Results

There were 150,701 admissions for DFI in the study cohort, including 98,361 Whites, 24,583 AAs, 24,472 Hispanics, and 1,654 Native Americans (NAs). Overall, 45,278 (30%) underwent a minor amputation, 9,039 (6%) underwent a major amputation, 3,151 underwent an open bypass, and 8,689 had an endovascular procedure. There was a decreasing incidence of major amputations and an increasing incidence of minor amputations over the study period (p < 0.05). The risks for major amputation were significantly higher (all p < 0.05) for AA (OR 1.4, 95% CI 1.4–1.5), Hispanic (OR 1.3, 95% CI 1.3–1.4), and NA (OR 1.5, 95% CI 1.2–1.8) patients with DFIs compared to White patients. Hispanics (OR 1.3, 95% CI 1.2–1.5) and AAs (OR 1.2, 95% CI 1.1–1.4) were more likely to receive endovascular intervention or open bypass than Whites (all p < 0.05). NA patients with DFI were less likely to receive a revascularization procedure (OR 0.6, 95% CI 0.3–0.9, p = 0.03) than Whites. The mean hospital LOS was significantly longer for AAs (9.2 days) and Hispanics (8.6 days) with DFIs than for Whites (8.1 days, p < 0.001).

Conclusion

Despite a consistent reduction in the incidence of amputation over the past decade, racial and ethnic minorities, including African American, Hispanic, and Native American patients, admitted to hospitals with DFIs have a significantly higher risk of major amputation and a longer hospital length of stay than their White counterparts. Native Americans were less likely to receive revascularization procedures than other minorities despite exhibiting an elevated risk of amputation. Further study is required to address and limit racial and ethnic disparities and to promote equity in the treatment and outcomes of these at-risk patients.

]]>
<![CDATA[Low, borderline and normal ankle-brachial index as a predictor of incident outcomes in the Mediterranean population-based ARTPER cohort after 9 years of follow-up]]> https://www.researchpad.co/article/5c5217f0d5eed0c484795af4

Background

Guidelines recommend adopting the same cardiovascular risk modification strategies used for coronary disease in the case of a low ankle-brachial index (ABI), but there are few studies on long-term cardiovascular outcomes in patients with borderline ABI, and even fewer in the general population.

Aim

The aim of the present study was to analyze the relationship between long-term cardiovascular events and low, borderline and normal ABI after a 9-year follow up of a Mediterranean population with low cardiovascular risk.

Design and setting

A population-based prospective cohort study was performed in the province of Barcelona, Spain.

Method

A total of 3,786 subjects aged >49 years were recruited from 2006–2008. Baseline ABI was 1.08 ± 0.16. Subjects were followed from enrollment to the end of follow-up in 2016 via phone calls every 6 months, systematic review of primary-care and hospital medical records, and analysis of the SIDIAP (Information System for Primary Care Research) database to confirm the possible occurrence of cardiovascular events.
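
The incidence figures reported below are annual rates per 1,000 person-years; a small sketch of that computation follows, assuming a one-row-per-subject table with follow-up time and event flags (the layout is an assumption).

```python
# Hedged sketch: annual MACE incidence per 1,000 person-years by ABI group.
import pandas as pd

cohort = pd.read_csv("artper_followup.csv")  # hypothetical: one row/subject
# assumed columns: abi_group ('low'|'borderline'|'normal'),
#                  years_followed (float), mace (0/1)

def incidence_per_1000py(g: pd.DataFrame) -> float:
    return 1000.0 * g["mace"].sum() / g["years_followed"].sum()

print(cohort.groupby("abi_group").apply(incidence_per_1000py))
# reported below: low ~26.9, borderline ~6.6, normal ~5.6 per 1,000 py
```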

Results

3,146 individuals participated in the study: 2,420 (77%) subjects had normal ABI, 524 (17%) had borderline ABI, and 202 (6.4%) had low ABI.

In comparison with normal and borderline subjects, patients with lower ABI had more comorbidities, such as hypertension, hypercholesterolemia and diabetes.

Cumulative MACE incidence at 9 years was 20% in patients with low ABI, 6% in borderline ABI and 5% in normal ABI.

The annual MACE incidence over the 9-year follow-up was significantly higher in people with low ABI (26.9/1,000 person-years; p < 0.001) than in those with borderline (6.6/1,000 person-years) or normal ABI (5.6/1,000 person-years).

After adjustment, subjects with borderline ABI were at significantly higher risk of coronary disease (HR 1.58; 95% CI 1.02–2.43; p = 0.040) than subjects with normal ABI.

Conclusion

The results of the present study indicate that low ABI was independently associated with a higher incidence of MACE, ICE, and cardiovascular and non-cardiovascular mortality, while borderline ABI carried a moderately but significantly higher risk of coronary disease than normal ABI.

]]>
<![CDATA[Outcomes of cardiac resynchronization therapy in patients with atrial fibrillation accompanied by slow ventricular response]]> https://www.researchpad.co/article/5c424385d5eed0c4845e0487

It remains unclear whether cardiac resynchronization therapy (CRT) is as effective in patients with atrial fibrillation accompanied by slow ventricular response (AF-SVR, <60 beats/min) as in those with sinus rhythm (SR). Echocardiographic reverse remodeling was compared between AF-SVR patients (n = 17) and those with SR (n = 88) at six and 12 months after CRT. We also evaluated changes in QRS duration, New York Heart Association (NYHA) functional class, and long-term composite clinical outcomes, including cardiac death, heart transplantation, and heart failure (HF)-related hospitalization. Left ventricular pacing sites and biventricular pacing percentages were not significantly different between the AF-SVR and SR groups, but the heart rate increase after CRT was significantly greater in the AF-SVR group than in the SR group (P < 0.001). At six and 12 months post-operation, both groups showed comparable improvements in NYHA class, QRS narrowing, and echocardiographic variables, including left ventricular end-systolic volume, left ventricular ejection fraction, and left atrial volume index. Over a median follow-up of 1.6 (interquartile range: 0.8–2.2) years, no significant between-group differences were observed in the rates of long-term composite clinical events (35% versus 24%; hazard ratio: 1.71; 95% confidence interval: 0.23–12.48; P = 0.60). By correcting electrical dyssynchrony and increasing the biventricular pacing rate, CRT provided beneficial effects in patients with AF-SVR comparable to those in patients with SR in terms of QRS narrowing, symptom improvement, ventricular reverse remodeling, and long-term clinical outcomes.

]]>
<![CDATA[Comparisons of early vascular reactions in biodegradable and durable polymer-based drug-eluting stents in the porcine coronary artery]]> https://www.researchpad.co/article/5c40f823d5eed0c48438714e

Current drug-eluting stents have an abluminal polymer coating; however, whether thrombus formation in these stents differs from that in uniformly coated stents remains controversial. We evaluated thrombus formation and early endothelialization after implantation of abluminal biodegradable polymer-coated sirolimus-eluting stents (BP-SES) and everolimus-eluting stents (BP-EES) versus durable polymer-coated everolimus-eluting stents (DP-EES) in an in vivo setting. BP-SES, BP-EES, and DP-EES (n = 6 each) were implanted in the coronary arteries of 12 mini-pigs, which were sacrificed after 7 and 10 days. Stents were stained with hematoxylin and eosin and with a combined Verhoeff and Masson trichrome stain. Areas of fibrin deposition were digitally detected and measured with off-line morphometric software, and re-endothelialization was investigated by transmission electron microscopy. At 7 days, histological analysis revealed the smallest area of fibrin deposition in BP-SES (BP-SES vs. BP-EES vs. DP-EES: 0.10 ± 0.06 mm2 vs. 0.15 ± 0.07 mm2 vs. 0.19 ± 0.06 mm2, p = 0.0004). At 10 days, the area of fibrin deposition was significantly greater in DP-EES (0.13 ± 0.04 mm2 vs. 0.14 ± 0.05 mm2 vs. 0.19 ± 0.08 mm2, p = 0.007). On transmission electron microscopy, endothelial cells in BP-SES demonstrated a significantly greater number of tight junctions than those in DP-EES at both time points (p < 0.05). Other parameters, including inflammatory reaction and neointimal formation, were comparable among the groups at 7 and 10 days. The abluminal biodegradable polymer-coated SES showed the least fibrin deposition and the greatest endothelial cell recovery at an early stage following implantation in the coronary arteries of mini-pigs.

]]>
<![CDATA[The axillary vein and its tributaries are not in the mirror image of the axillary artery and its branches]]> https://www.researchpad.co/article/5c40f76dd5eed0c484386118

Introduction

The axillary and cephalic veins are used for various clinical purposes, but their anatomy is not fully understood. Increased knowledge of these veins, as well as of the superficial veins of the upper arm, would therefore be useful.

Objective

The aim of this study is to contribute to the literature regarding the anatomy of the venous drainage of the upper extremity.

Methods

The veins of forty upper extremities from twenty-one adult cadavers were injected and their axillary regions dissected. The course and drainage pattern of the venous tributaries in the axillary region were identified and recorded.

Results

The basilic, brachial, subscapular, lateral thoracic and superior thoracic veins drained mainly into the axillary vein, in common with most textbook descriptions. However, the thoracoacromial veins were observed to drain into the cephalic vein in 70.0% of upper limbs. In addition, a venous channel connecting the distal part and proximal part of the axilla was found along the posterolateral wall of the axilla in 77.5% of the upper limbs. In 95.0% of upper limbs, we discovered a superficial vein which ran from the axillary base and drained directly into the axillary vein.

Conclusion

The veins from the inferomedial part of the axilla drain into the axillary vein, whereas the veins from the superolateral part of the axilla drain into the cephalic vein. The venous drainage of the axilla is variable and in common with venous drainage elsewhere, does not necessarily follow the pattern of the arterial supply.

]]>
<![CDATA[Competing risks of major bleeding and thrombotic events with prasugrel-based dual antiplatelet therapy after stent implantation - An observational analysis from BASKET-PROVE II]]> https://www.researchpad.co/article/5c478c6dd5eed0c484bd2457

Background

Dual antiplatelet therapy (DAPT) prevents thrombotic events after coronary stent implantation but may induce bleeding, particularly in elderly patients. However, a competing risk analysis is lacking.

Objectives

To assess the determinants of major bleeding and the balance between the competing risks of major bleeding and thrombotic events during prasugrel-based DAPT after stent implantation.

Methods

Overall, 2,291 patients randomized to drug-eluting or bare metal stents and treated with prasugrel 10 mg/day for 1 year were followed over 2 years for major bleeding (BARC 3/5) and thrombotic events (cardiac death, myocardial infarction, definite/probable stent thrombosis). The prasugrel dose was reduced to 5 mg in patients >75 years and/or <60 kg. Predictors of major bleeding and the competing risks of major bleeding and thrombotic events were assessed.

Results

Two-year rates of major bleeding and thrombotic events were 2.9% and 9.0%, respectively. The only independent predictor of major bleeding was age (hazard ratio per year of increase 1.05 [95% CI 1.02–1.07], p < 0.001). The relationship between major bleeding and age was non-linear, with the lowest hazard ratios at 57 years and an exponential increase only above 65 years. In contrast, the relationship between thrombotic events and age was linear and continuously increasing with older age. While the competing risk of thrombotic events was higher than that of major bleeding in younger patients, the two risks were similar in older patients. After discontinuation of prasugrel, bleeding events leveled off in all patients, while thrombotic events continued to increase.
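
To make the per-year hazard ratio concrete (a back-of-envelope reading that deliberately ignores the non-linearity described above), per-year hazard ratios compound multiplicatively over an age gap:

```latex
% Compounding the reported per-year HR of 1.05 from the 57-year hazard
% minimum up to age 75:
\mathrm{HR}(75\ \text{vs.}\ 57) = 1.05^{\,75 - 57} = 1.05^{18} \approx 2.4
```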

Conclusions

In prasugrel-based DAPT, age is the strongest risk factor for major bleeding, increasing exponentially above 65 years. In younger patients, thrombotic events represent a higher risk than bleeding, while the thrombotic and bleeding risks are similar in older patients. Important clinical implications relate to the prasugrel dose in the elderly, the duration of DAPT, and the competing risk balance, which necessitates individualized treatment decisions.

]]>
<![CDATA[Temporal trends in prevalence and antithrombotic treatment among Asians with atrial fibrillation undergoing percutaneous coronary intervention: A nationwide Korean population-based study]]> https://www.researchpad.co/article/5c478c3ad5eed0c484bd0ebc

Background

We investigated the recent 10-year trends in the number of patients with atrial fibrillation (AF) undergoing percutaneous coronary intervention (PCI) in relation to prescription patterns of antithrombotic therapy.

Methods

We analyzed the annual prevalence of PCI and patterns of antithrombotic therapy after PCI, including antiplatelets and oral anticoagulants (vitamin K antagonists and non-vitamin K antagonist oral anticoagulants [NOACs]), in patients with AF between 2006 and 2015 by using the Korean National Health Insurance Service database. Independent factors associated with triple therapy (oral anticoagulant plus dual antiplatelet) prescription were assessed using multivariable logistic regression analysis.

Results

The number of patients with AF undergoing PCI increased gradually from 2006 (n = 2,140) to 2015 (n = 3,631) (p for trend <0.001). In 2006, only 22.7% of patients received triple therapy after PCI although 96.2% of them had an indication for anticoagulation (CHA2DS2-VASc score ≥2). The prescription rate of triple therapy increased to 38.3% in 2015 (p for trend <0.001), mainly owing to an increase in NOAC-based triple therapy from 2013 onward (17.5% in 2015). Previous ischemic stroke or systemic embolism, older age, hypertension, and congestive heart failure were significantly associated with a higher triple therapy prescription rate, whereas previous myocardial infarction, PCI, and peripheral arterial disease were associated with underuse of triple therapy.

Conclusions

From 2006 to 2015, the number of patients with AF undergoing PCI and the prescription rate of triple therapy increased gradually with a recent increment of NOAC-based antithrombotic therapy from 2013. Previous myocardial infarction, peripheral artery disease, and PCI were associated with underuse of triple therapy.

]]>
<![CDATA[Relationship between FFR, CFR and coronary microvascular resistance – Practical implications for FFR-guided percutaneous coronary intervention]]> https://www.researchpad.co/article/5c3d017ad5eed0c48403bbb1

Objective

The aim was threefold: 1) to expound the independent physiological parameters that drive fractional flow reserve (FFR), 2) to elucidate contradictory conclusions between FFR and coronary flow reserve (CFR), and 3) to highlight the need for both FFR and CFR in clinical decision making. Simple explicit theoretical models were supported by coronary data analyzed retrospectively.

Methodology

FFR was expressed as a function of the pressure loss coefficient, aortic pressure, and hyperemic coronary microvascular resistance. The FFR-CFR relationship was also demonstrated mathematically and shown to depend exclusively on the coronary microvascular resistances. The equations were validated in a first series of 199 lesions in which pressures and distal velocities were monitored. A second dataset of 75 lesions with pre- and post-PCI measurements of FFR and CFR was analyzed to investigate the clinical impact of our hemodynamic reasoning.
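
A hedged reconstruction of the kind of relationship described here (the paper's exact formulation may differ): if venous pressure is neglected and flow is distal pressure over microvascular resistance, then

```latex
Q = \frac{P_d}{R}, \qquad
\mathrm{FFR} = \frac{P_{d,\mathrm{hyp}}}{P_a}, \qquad
\mathrm{CFR} = \frac{Q_{\mathrm{hyp}}}{Q_{\mathrm{rest}}}
             = \frac{P_{d,\mathrm{hyp}}}{P_{d,\mathrm{rest}}}
               \cdot \frac{R_{\mathrm{rest}}}{R_{\mathrm{hyp}}}
% with a small resting transstenotic gradient, P_{d,rest} \approx P_a:
\mathrm{CFR} \approx \mathrm{FFR} \cdot \frac{R_{\mathrm{rest}}}{R_{\mathrm{hyp}}}
\;\Longrightarrow\;
\frac{\mathrm{CFR}}{\mathrm{FFR}} \approx \frac{R_{\mathrm{rest}}}{R_{\mathrm{hyp}}}
```

On this reading, the CFR/FFR ratio isolates the microvascular resistance reserve, which is one way to understand its use in the Results below.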

Results

Hyperemic coronary microvascular resistance and the pressure loss coefficient had comparable impacts (45% and 49%) on FFR. There was good concordance (y = 0.96x − 0.02, r² = 0.97) between measured CFR and CFR predicted from FFR and the coronary resistances. In patients with CFR < 2 and CFR/FFR ≥ 2, post-PCI CFR was significantly >2 (p < 0.001), whereas it was not (p = 0.94) in patients with CFR < 2 and CFR/FFR < 2.

Conclusion

The FFR behavior and FFR-CFR relationship are predictable from basic hemodynamics. Conflicting conclusions between FFR and CFR are explained from coronary vascular resistances. As confirmed by our results, FFR and CFR are complementary; they could jointly contribute to better PCI guidance through the CFR-to-FFR ratio in patients with coronary artery disease.

]]>
<![CDATA[Relationship between the extent of aortic replacement and stent graft for acute DeBakey type I aortic dissection and outcomes: Results from a medical center in Taiwan]]> https://www.researchpad.co/article/5c390bbcd5eed0c48491e113

Background

Total arch replacement (TAR) and/or stent graft implantation has been proposed as the primary surgical treatment for acute DeBakey type I aortic dissection. However, this recommendation is based on the excellent outcomes of high-volume aortic centers, and it is unclear whether these results generalize elsewhere. The purpose of this study was to compare in-hospital mortality, major complications, and aortic remodeling between strategies with and without TAR and/or stent graft implantation in a medical center in northern Taiwan.

Methods

Between January 2008 and August 2017, 156 patients with acute type I aortic dissection underwent surgery at our institution, including proximal aortic replacement only (Group I, n = 72), concomitant TAR (Group II, n = 23), concomitant TAR extended with stent grafting (Group III, n = 45), and proximal aortic replacement with descending aortic stent grafting (Group IV, n = 16).

Results

No significant differences were found among the four groups in underlying disease or preoperative presentation, including operative risk. Overall in-hospital mortality was 22.4% (13 patients in Group I, 9 in Group II, 12 in Group III, and 1 in Group IV). New-onset stroke occurred postoperatively in 15 patients (3 [5.2%] in Group I, 3 [21.4%] in Group II, and 9 [26.5%] in Group III, after excluding 36 patients with documented preoperative cerebrovascular accident or cerebral malperfusion). Root reconstruction and TAR were significantly associated with in-hospital mortality, and TAR was significantly associated with surgery-related stroke. Compared with Group I, true lumen expansion and false lumen shrinkage during 1-year aortic remodeling were significantly greater in Groups III and IV. Both TAR and descending aortic stent grafting were significantly associated with a decreased risk of a patent false lumen.

Conclusions

Proximal aortic replacement remains the preferred surgical strategy for acute type I aortic dissection, with lower mortality and fewer neurological complications. Stent grafting of the proximal descending aorta may benefit aortic remodeling, even without TAR.

]]>