
Search results


March 2023
Eyal Leibovitz MD, Mona Boaz PhD, Israel Khanimov MD, Gary Mosiev MD, Mordechai Shimonov MD

Background: Despite its wide use, evidence is inconclusive regarding the effect of percutaneous endoscopic gastrostomy (PEG) in patients with chronic diseases and dementia among hospitalized patients with malnutrition.

Objectives: To examine the effect of PEG insertion on prognosis after the procedure.

Methods: This retrospective analysis of medical records included all adult patients who underwent PEG insertion between 1 January 2009 and 31 December 2013 during their hospitalization. For each PEG patient, two controls similar in age, sex, referring department, and underlying condition were randomly selected from the entire dataset of patients admitted. The effect of PEG on mortality and repeated admissions was examined.

Results: The study comprised 154 patients, 49 referred for PEG insertion and 105 controls (mean age 74.8 ± 19.8 years; 72.7% females; 78.6% admitted to internal medicine units). Compared to controls, the PEG group had a higher 2-year mortality rate (59.2% vs. 17.1%, P < 0.001), but the 2-year readmission rate did not differ significantly (44.9% vs. 56.2%, respectively, P = 0.191). Regression analysis showed PEG was associated with increased risk of the composite endpoint of death or readmission (hazard ratio 1.514, 95% confidence interval 1.016–2.255, P = 0.041). No specific characteristic of admission was associated with increased likelihood of death or readmission. Among readmitted patients, reasons for admission and baseline laboratory data, including albumin and cholesterol, did not differ between the PEG patients and controls.

Conclusions: In-hospital PEG insertion was associated with increased mortality at 2 years but had no effect on readmissions.

January 2014
Mona Boaz, Alexander Bermant, Tiberiu Ezri, Dror Lakstein, Yitzhak Berlovich, Iris Laniado RN and Zeev Feldbrin
Background: Surgical adverse events are errors that emerge during perioperative patient care. The World Health Organization recently published “Guidelines for Safe Surgery.”

Objectives: To estimate the effect of implementation of a safety checklist in an orthopedic surgical department.


Methods: We conducted a single-center cross-sectional study to compare the incidence of complications prior to and following implementation of the Guidelines for Safe Surgery checklist. The medical records of all consecutive adult patients admitted to the orthopedics department at Wolfson Medical Center during the period 1 July 2008 to 1 January 2009 (control group) and from 1 January 2009 to 1 July 2009 (study group) were reviewed. The occurrences of all complications were compared between the two groups.

Results: The records of 760 patients (380 in each group) hospitalized during this 12-month period were analyzed. Postoperative fever occurred in 5.3% vs. 10.6% of patients with and without the checklist, respectively (P = 0.008). Significantly more patients received only postoperative prophylactic antibiotics rather than both pre- and postoperative antibiotic treatment prior to implementation of the checklist (3.2% vs. 0%, P = 0.004). In addition, a statistically non-significant 34% decrease in the rate of surgical wound infection was detected in the checklist group. In a logistic regression model of postoperative fever, the checklist emerged as a significant independent predictor of this outcome: odds ratio 0.53, 95% confidence interval 0.29–0.96, P = 0.037.
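As a back-of-the-envelope check (not an analysis from the paper), the unadjusted odds ratio implied by the reported fever rates can be computed directly and compared with the adjusted odds ratio of 0.53 from the logistic regression; the variable names below are illustrative:

```python
def odds(p):
    """Convert a probability to the corresponding odds."""
    return p / (1 - p)

# Postoperative fever rates reported in the abstract
fever_with_checklist = 0.053     # 5.3% in the checklist (study) group
fever_without_checklist = 0.106  # 10.6% in the control group

# Unadjusted odds ratio: odds of fever with the checklist vs. without it
unadjusted_or = odds(fever_with_checklist) / odds(fever_without_checklist)
print(round(unadjusted_or, 2))  # 0.47, in line with the adjusted OR of 0.53
```

The crude ratio (0.47) and the adjusted ratio (0.53) point in the same direction, which is what one would expect if the covariates in the regression only partly attenuate the association.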

Conclusion: A significant reduction in postoperative fever after the implementation of the surgical safety checklist was found. It is possible that the improved usage of preoperative prophylactic antibiotics may explain the reduction in postoperative fever.

August 2012
A. Ballin, Y. Senecky, U. Rubinstein, E. Schaefer, R. Peri, S. Amsel, M. Vol, Y. Amit and M. Boaz

Background: The pathogenesis of anemia associated with acute infection in children has not been well delineated.

Objectives: To characterize this type of anemia in children with acute infection, mainly in relation to iron status.

Methods: Two cross-sectional studies compared the prevalence and severity of anemia between outpatient febrile children and age-matched non-febrile controls.

Results: In part 1 of the study, children with acute infection (n=58) had a significant decrease in hemoglobin levels compared with 54 non-febrile controls. Controlling for mean corpuscular volume (MCV) did not change this association. Moreover, there was no significant difference in MCV, mean cell hemoglobin, or red cell distribution width values between the two groups. In part 2, of the 6534 blood counts obtained in community clinics, 229 were defined as “bacterial infection,” a diagnosis confirmed by chart review. White blood cell count was significantly inversely associated with hemoglobin level (r = -0.36, P < 0.0001). Anemia was significantly more prevalent among children with bacterial infection than among those without: 21.4% vs. 14.1% (P = 0.002). Mean values of iron status parameters were all within normal limits.

Conclusions: Acute illness is associated with anemia. The pathogenesis of this anemia does not appear to be associated with disruption of iron metabolism.

July 2012
S. Giryes, E. Leibovitz, Z. Matas, S. Fridman, D. Gavish, B. Shalev, Z. Ziv-Nir, Y. Berlovitz and M. Boaz
Background: Depending on the definition used, malnutrition is prevalent among 20–50% of hospitalized patients. Routine nutritional screening is necessary to identify patients with, or at increased risk for, malnutrition. The Nutritional Risk Screening (NRS 2002) has been recommended as an efficient tool to identify the risk of malnutrition in adult inpatients.

Objectives: To utilize the NRS 2002 to estimate the prevalence of malnutrition among newly hospitalized adult patients, and to identify risk factors for malnutrition.

Methods: During a 5-week period, all adult patients newly admitted to all inpatient departments (except Maternity and Emergency) at Wolfson Medical Center, Holon, were screened using the NRS 2002. An answer of yes recorded for any of the Step 1 questions triggered the Step 2 screen, on which an age-adjusted total score ≥ 3 indicated high malnutrition risk.
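The two-step rule just described can be sketched as a small decision function. This is an illustrative reconstruction based on the published NRS 2002 form, not code from the study; the function name, the four Step 1 questions, and the component score ranges are assumptions:

```python
def nrs_2002_high_risk(bmi, weight_loss_3mo, reduced_intake, severely_ill,
                       nutrition_score, disease_score, age):
    """Sketch of the NRS 2002 two-step screen (assumed form).

    Step 1: four yes/no pre-screening questions; any "yes" triggers Step 2.
    Step 2: impaired-nutritional-status score (0-3) plus disease-severity
    score (0-3), plus 1 point if age >= 70; a total >= 3 flags high risk.
    """
    step1_positive = (bmi < 20.5 or weight_loss_3mo
                      or reduced_intake or severely_ill)
    if not step1_positive:
        return False  # screen ends at Step 1: not at risk
    age_adjusted_total = (nutrition_score + disease_score
                          + (1 if age >= 70 else 0))
    return age_adjusted_total >= 3

# Example: a 75-year-old with mild scores in both components reaches the
# threshold (1 + 1 + 1 age point = 3 -> high risk).
print(nrs_2002_high_risk(bmi=19.0, weight_loss_3mo=True, reduced_intake=False,
                         severely_ill=False, nutrition_score=1,
                         disease_score=1, age=75))  # True
```

Note how the age adjustment can tip a borderline elderly patient over the threshold, which is consistent with the abstract's finding that high-risk subjects were significantly older.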

Results: Data were obtained from 504 newly hospitalized adult patients, of whom 159 (31.5%) were identified as at high risk for malnutrition. Malnutrition was more prevalent in internal medicine than surgical departments: 38.6% vs. 19.1% (P < 0.001). Body mass index was within the normal range among subjects at high risk for malnutrition (23.9 ± 5.6 kg/m2) but significantly lower than in subjects at low malnutrition risk (27.9 ± 5.3 kg/m2, P < 0.001). Malnutrition risk did not differ by gender or smoking status, but subjects at high malnutrition risk were significantly older (73.3 ± 16.2 vs. 63.4 ± 18.4 years, P < 0.001). Total protein, albumin, total cholesterol, low density lipoprotein-cholesterol, hemoglobin, and %lymphocytes were all significantly lower, whereas urea, creatinine, and %neutrophils were significantly higher in patients at high malnutrition risk.

Conclusions: Use of the NRS 2002 identified a large proportion of newly hospitalized adults as being at high risk for malnutrition. These findings indicate the need to intervene on a system-wide level during hospitalization.
April 2008
F. Serour, A. Gorenstein and M. Boaz

Background: Reports of burn injuries in children are usually made by highly specialized burn units. Our facility admits children with burns < 20% total body surface area, while those with major burns are transferred to burn units at tertiary care facilities.

Objectives: To review our experience with thermal burns.

Methods: We conducted a retrospective review of all thermal burns admitted to our hospital during a 5 year period.

Results: Among 266 patients (69.2% boys) aged 3.5 ± 3.6 years, children < 3 years old were the most frequently injured (64.7%). Scalds (71.4%) were the most common type of burn. Partial thickness burns were sustained by 96.6% of children and TBSA[1] burned was 4.2 ± 3.6%. The mean hospital stay was 3.8 ± 4.5 days, and was significantly prolonged in girls (4.6 ± 4.8 vs. 3.5 ± 4.3 days, P = 0.01). Percent TBSA burned was correlated with patient age (r = 0.12, P = 0.04) and length of hospital stay (r = 0.6, P < 0.0001). Six patients (2.3%) (mean age 3.4 ± 2.3 years) were hospitalized in the Pediatric Intensive Care Unit due to toxin-mediated illness.

Conclusions: Children under the age of 3 years are at increased risk for burn injury, but older children sustain more extensive injuries. Prevention and awareness are needed for child safety.






[1] TBSA = total body surface area


February 2006
E. Leshinsky-Silver, S. Cheng, M.A. Grow, S. Schwartz, L. Scharf, D. Lev, M. Boaz, D. Brunner and R. Zimlichman

Background: Cardiovascular disease is now well established as multifactorial. In a given individual, the level of cardiovascular risk is due to the interaction between genetic and environmental components. The BIP cohort comprised 3000 patients with cardiovascular disease who were tested for the benefits of bezafibrate treatment. For each individual, this cohort includes the lipid profile, fibrinogen, and insulin, as well as clinical, demographic, and lifestyle parameters.

Objectives: To genotype up to 64 variable sites in 36 genes in the BIP cohort. The genes tested in this assay are involved in pathways implicated in the development and progression of atherosclerotic plaques, lipid and homocysteine metabolism, blood pressure regulation, thrombosis, the renin-angiotensin system, platelet aggregation, and leukocyte adhesion.

Methods: DNA was extracted from 1000 Israeli patients from the BIP cohort. A multilocus assay, developed by Roche Molecular Systems, was used for genotyping. Allele frequencies for some of the markers were compared to the published frequencies in a healthy population (the French Stanislas cohort, n=1480).

Results: Among the 26 comparable alleles checked in the two cohorts, 16 allele frequencies were significantly different from the healthy French population: ApoE (E3, E2, E4), ApoB (71ile), ApoC (3482T, 455C, 1100T, 3175G, 3206G), CETP (405val), ACE (Del), AGT (235thr), ELAM (128arg), P < 0.001; and LPL (93G, 291Ser, 447ter), P < 0.05.

Conclusions: Although a comparable healthy Israeli population study is needed for more precise interpretation of these results, frequency differences in these polymorphic alleles, associated with lipid metabolism, the renin-angiotensin system, and leukocyte adhesion, between CVD patients and healthy individuals nevertheless implicate these candidate genes as predisposing for CVD.
 

November 2005
Z. Katzir, A. Michlin, M. Boaz, A. Biro and S. Smetana
Background: During maintenance hemodialysis, acute elevation in serum calcium is common. Low calcium dialysis is advocated as a therapy for prevention of dialysis-induced hypercalcemia. Approximately 16% of our chronic hemodialysis patients experience elevated arterial blood pressure during the hemodialysis session, becoming hypertensive by the end of the treatment. All these patients exhibit post-dialysis hypercalcemia.

Objectives: To investigate the effect of low calcium dialysis on post-dialysis hypertension in view of an evident link between serum calcium and blood pressure in both normal renal function and chronic renal failure patients.

Methods: We evaluated 19 chronic hemodialysis patients in whom both post-dialysis hypertension and PDHCa[1] were observed. We investigated changes in serum total calcium, ionized calcium, intact parathormone levels and arterial blood pressure in response to 4 weeks low calcium dialysis as a treatment for PDHCa.

Results: When PDHT[2] patients were treated with low calcium dialysis, post-dialysis blood pressure was significantly decreased compared to pre-dialysis values (155.3 ± 9.7/82.2 ± 7.9 mmHg pre-dialysis vs. 134.1 ± 20.8/80 ± 8.6 mmHg post-dialysis, P = 0.001). Additionally, post-dialysis blood pressure was significantly lower than post-dialysis blood pressure prior to the low calcium dialysis treatment (176.1 ± 15/86 ± 10.8 mmHg post-standard dialysis, 134.1 ± 20.8/80 ± 8.6 mmHg after low calcium dialysis, P = 0.001). A decline in post-dialysis serum calcium (2.34 ± 0.2 vs. 2.86 ± 0.12 mmol/L, P = 0.04) and ionized calcium (1.17 ± 0.12 vs. 1.3 ± 0.06 mmol/L, P = 0.03) compared to pre-dialysis levels was also achieved by this treatment, with no significant changes in iPTH[3] levels.

Conclusions: These data suggest a role for low calcium dialysis in treating acute serum calcium elevation and post-dialysis hypertension in patients receiving maintenance hemodialysis.


 



[1] PDHCa = post-dialysis hypercalcemia

[2] PDHT = post-dialysis hypertension

[3] iPTH = intact parathormone


October 2003
M. Boaz, S. Smetana, Z. Matas, A. Bor, I. Pinchuk, M. Fainaru, M.S. Green and D. Lichtenberg

Background: In lipid oxidation kinetics studies, prevalent cardiovascular disease has been associated with shortened lag phase, the length of time preceding the onset of oxidation.

Objectives: To examine, in vitro, copper-induced lipid oxidation kinetics in unfractionated serum from hemodialysis patients and to determine differences in kinetic parameters between patients with and without a history of CVD[1].

Methods: Of the 76 patients enrolled in a study of oxidative stress in hemodialysis (44/76 with prevalent CVD, 53/76 males), 9 males with a history of myocardial infarction were selected and matched for age, diabetes and smoking status with 9 males from the non-CVD group. The kinetics of lipid oxidation was studied. Blood chemistry determinations including serum lipids, lipoproteins, hemostatic factors and serum malondialdehyde were obtained. Variables were compared using the t-test for independent samples with history of MI[2] entered as the categorical variable.

Results: Tmax, the oxidation kinetic parameter defined as the time at which the rate of absorbing product accumulation was maximal, was significantly shorter in dialysis patients with a history of MI than in those without (115.2 ± 38.5 vs. 162.7 ± 48.9 minutes, P = 0.04). Further, Tmax and MDA[3] were negatively correlated (r = -0.47, P = 0.04). Odds ratios indicate that each 1-minute increase in Tmax was associated with a 3% decrease in the odds that a subject had a history of MI.
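To put the per-minute figure in context, odds ratios compound multiplicatively, so the 3% decrease per minute (an odds ratio of roughly 0.97 per minute, inferred here from the text) can be scaled to the 47.5-minute difference in mean Tmax between the groups. This is illustrative arithmetic, not an analysis from the paper:

```python
per_minute_or = 0.97       # ~3% decrease in MI odds per 1-minute Tmax increase
tmax_gap = 162.7 - 115.2   # difference in mean Tmax between groups (minutes)

# Odds ratios are multiplicative (additive on the log-odds scale), so the
# effect over the whole gap is the per-minute ratio raised to the minutes.
cumulative_or = per_minute_or ** tmax_gap
print(round(cumulative_or, 2))  # 0.24 over the 47.5-minute gap
```

In other words, under this reading a patient at the non-MI group's mean Tmax would have roughly a quarter of the MI odds of a patient at the MI group's mean Tmax, consistent with the direction of the group comparison.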

Conclusions: These findings indicate the presence of increased oxidative stress in hemodialysis patients with a history of MI.






[1] CVD = cardiovascular disease



[2] MI = myocardial infarction



[3] MDA = malondialdehyde

