Showing posts with label Research Highlights. Show all posts

Thursday, February 19, 2015

Research Highlights: Review of Jian Ling Decoction Efficacy for Treating Hypertension Shows Inconclusive but Potentially Promising Results


Review of Jian Ling Decoction Efficacy for Treating Hypertension Shows Inconclusive but Potentially Promising Results

Hypertension is a massive public health issue today. It leads to increased risk for cardiovascular and renal disease, and is currently ranked as a leading risk factor for mortality globally despite being largely preventable. It is thus vital to develop effective treatments for hypertension to relieve symptoms early. In East Asia, practitioners of traditional Chinese medicine often prescribe Jian Ling Decoction (JLD), a mixture of eight common herbs, to treat essential hypertension.
Recent trials have suggested that JLD may effectively relieve hypertension symptoms, but these trials are small and their results are often unclear or inconclusive. A recent study published in BMJ Open has compiled a comprehensive review of these trials to better evaluate the overall efficacy of JLD as a hypertension treatment.
This study reviewed ten randomized controlled trials with 655 participants total. Trials were only included in this review if they focused on patients who met the diagnostic criteria for essential hypertension, and if they tested the effects of JLD against the effects of another drug for hypertension treatment. JLD produced no serious side effects in patients in any of these trials.
After compiling the results of all these trials, researchers found some evidence to suggest that JLD can greatly reduce systolic and diastolic blood pressure in hypertensive patients, but this result was only significant in trials where JLD was combined with an existing hypertension treatment in the experimental group. Trial results also suggested that JLD may improve long-term quality of life in hypertensive patients. Overall, though the results were inconclusive, there is some evidence that JLD may ameliorate certain hypertension symptoms, and future studies are needed to explore these findings further.
This study was not without limitations. The trials reviewed varied in their methods and scope, and they often had small sample sizes. The number of trials reviewed was also fairly small. This review is still important, however, as it paves the way for further research into the potential efficacy of JLD and explores the vital issue of treating hypertension globally.


BMJ Open. 2015;5:e006502 doi:10.1136/bmjopen-2014-006502

Caroline Russell-Troutman is the 2014-2015 Research Highlights Editor.

Tuesday, February 10, 2015

Research Highlights: New Device Implanted in Coronary Sinus May Improve Symptoms of Refractory Angina


New Device Implanted in Coronary Sinus May Improve Symptoms of Refractory Angina 

A recent study published in the New England Journal of Medicine (NEJM) has investigated the efficacy of a new device designed to relieve refractory angina in patients who suffer from coronary artery disease. The device is implanted in the coronary sinus, where it increases pressure to relieve pain caused by angina.
104 patients at 11 different clinical centers participated in this study. All patients were over 18 years of age and had been diagnosed with class III or IV angina that had not improved with existing therapies. Half of the participants had the device implanted while the other half had a sham device implanted and served as the control group. This study was double-blind, so neither the participants nor the researchers (with the exception of the physicians who performed the implantations) knew which participants were in which group. Over the course of 3 years, researchers monitored patient angina by measuring cardiac wall motion and by asking participants to rate their symptoms in a questionnaire.
Results showed significant improvement in at least one angina class in 71% of the treatment group compared to only 42% of the control group. 35% of treatment group participants further reported improvement in 2 or more angina classes compared to 15% of control group participants. Thus, the implantation of this device was associated with greater relief from angina-related symptoms.
As heart disease rates and life expectancies for patients with coronary heart disease rise in the Western world, rates of refractory angina rise as well. It is thus vital to develop improved therapies for the symptoms of angina, since many current treatments are not very effective. However, despite its clinical importance, this study was not without limitations. The study relied heavily on patient self-report and the number of participants was relatively small (n = 104). Further studies are needed to confirm the efficacy of the device and to understand its applications on a wider scale, but for now this coronary sinus implantation device shows promise as an effective treatment for refractory angina.

N Engl J Med. 2015; 372:519-527. doi: 10.1056/NEJMoa1402556.

Caroline Russell-Troutman is the 2014-2015 Research Highlights Editor.

Monday, October 20, 2014

Research Highlights: Evidence Suggests That Recent Outbreak of Ebola Virus in the Democratic Republic of Congo is Independent From West African Epidemic

Evidence Suggests That Recent Outbreak of Ebola Virus in the Democratic Republic of Congo is Independent From West African Epidemic

As international panic regarding the recent West African Ebola virus outbreak spreads, containment of the disease has become increasingly important. On August 24th, the World Health Organization confirmed yet another outbreak of Ebola in a city in the Democratic Republic of Congo (DRC), a country quite distant from the Western African nations where this year’s Ebola epidemic originated. A recent study published in the New England Journal of Medicine has investigated whether the recent DRC outbreak is the result of spillover from West Africa or if it is a new and independent outbreak. Researchers then further investigated the nature of the DRC outbreak (e.g. its rate of transmission, geographic distribution, etc.) compared to other African Ebola variants.
Researchers collected blood samples from eight patients in the DRC outbreak region who were showing symptoms of Ebola. The participants gave their informed consent for this procedure. Samples were sent to the Institut National de Recherche Biomédicale in Kinshasa as well as the WHO reference center for viral hemorrhagic fever in order to analyze the virus via real-time PCR. Results showed that the Ebola virus from the recent DRC outbreak shares 99.2% identity with the virus from a 1995 outbreak in a nearby region of the country. Researchers have thus concluded that there is no epidemiological link between the DRC outbreak and the current West African outbreak; evidence strongly suggests that the two outbreaks are distinct and independent. Fortunately, this finding has led researchers to believe that the current DRC outbreak will run a similar course to past DRC Ebola outbreaks, which have typically had lower rates of spillover into neighboring nations and less aggressive transmission rates compared to the current Ebola epidemic in West Africa. Past Ebola outbreaks in the DRC have generally been brought under control within a few months, so researchers expect the current DRC outbreak to be much more easily contained than the West African epidemic.
This study raises questions about how best to control Ebola in West Africa and why the risk of widespread transmission is so much lower in the DRC and other countries in Equatorial Africa. Researchers noted that differences in population density, cultural norms, and response to epidemics between these two regions of Africa could account for some of the differences in Ebola risk. However, these potential explanations must be tested in future studies before researchers can propose a better course of action for controlling Ebola in West African nations. 


NEJM [Internet]. October 14, 2014 [cited October 20, 2014]; Available from: http://www.nejm.org/doi/full/10.1056/NEJMoa1411099#t=articleDiscussion
doi: 10.1056/NEJMoa1411099



Caroline Russell-Troutman is the 2014-2015 Research Highlights Editor


Sunday, September 21, 2014

Research Highlights: Temporary Blocking of the Intra-Abdominal Vagal Nerve Not Effective Enough to Replace Bariatric Surgery as a Treatment for Obesity

Temporary Blocking of the Intra-Abdominal Vagal Nerve Not Effective Enough to Replace Bariatric Surgery as a Treatment for Obesity

Bariatric surgery is a common treatment for weight loss in obese patients, but it can carry significant risks such as increased morbidity and an unappealing distortion of the body. Researchers have been investigating the efficacy of reversibly blocking the vagal nerve as a possible alternative treatment, as this approach would be less invasive and carry fewer risks than bariatric surgery. The intra-abdominal vagal nerve helps regulate metabolism, GI tract function, and feelings of hunger. Previous studies have shown that temporarily blocking this nerve can result in substantial weight loss, but the results have only been significant in patients who received the treatment for at least 12 hours a day.
239 people from Australia and the United States participated in this study. Participants were eligible if they could be classified as obese by their BMI and if they had an obesity-linked condition such as hypertension, type II diabetes, or sleep apnea. Researchers used an implanted device to block the vagal nerve. Two thirds of participants received an active device while the remaining third had a sham device implanted in order to act as a control group. For the experimental group, devices blocked the vagal nerve for 12 hours daily over a period of one year with occasional follow-up visits to monitor experiment progress and patient safety. For the sake of consistency within results, researchers did not encourage or prescribe any diet or exercise regimen for the participants.
Overall, participants with active devices lost an average of 24.4% of their excess weight, while sham participants (the control group) lost only 15.9%, a statistically significant difference. Despite this difference, the weight lost by experimental participants did not meet the weight loss objectives set by the researchers to measure treatment efficacy. Researchers had hoped for at least a 10 percentage point difference between treatments, but results showed only an 8.5 point difference. Therefore, blockage of the vagal nerve, while shown to result in some patient weight loss, was not deemed effective enough to be a treatment option under these experimental circumstances.

JAMA. 2014; 312(9): 915-922


Caroline Russell-Troutman is the 2014-2015 Research Highlights Editor

Monday, April 14, 2014

Research Highlights: Radiofrequency Ablation Associated with a Decreased Rate of Cancer Development from Barrett Esophagus with Low-Grade Dysplasia

Radiofrequency Ablation Associated with a Decreased Rate of Cancer Development from Barrett Esophagus with Low-Grade Dysplasia

In Western nations, esophageal cancer is becoming an increasingly prevalent threat. Barrett esophagus, a disorder affecting the esophageal lining, is associated with an increased risk of esophageal cancer when it is accompanied by low-grade esophageal dysplasia. It is thus important to stop this disorder from progressing into a cancerous state. A study recently published in the Journal of the American Medical Association (JAMA) has investigated whether radiofrequency ablation could be an effective treatment for individuals suffering from Barrett esophagus with low-grade dysplasia, and prevent the development of more serious dysplasia or of esophageal cancer.
Patients at nine different European Barrett treatment centers participated in this study. Experimenters only deemed patients eligible if they had received an endoscopy revealing a clear case of Barrett esophagus with low-grade dysplasia, had a life expectancy of greater than two years, and were between the ages of 18 and 85. After screening for eligibility, experimenters randomly assigned participants to either an experimental or control group, with the experimental group receiving a radiofrequency ablation treatment and the control group receiving endoscopic surveillance. This experiment was double blind, so experimenters and patients alike were unaware of each patient’s group assignment.
Patients in the experimental group received treatment every three months from either a circumferential or focal ablation device, depending on the nature and severity of their disorder. This treatment continued until the patient's Barrett esophagus had been eliminated, or until a certain maximum number of ablation sessions had been reached. Experimenters then performed a follow-up endoscopy three months later to observe any changes in the patient's health. Experimenters continued to perform regular follow-up endoscopies until three years after the initial ablation. Patients in the control group received high-resolution endoscopies regularly for three years after the initial group randomization but did not undergo any ablation. During endoscopies, experimenters checked primarily for incidence of high-grade dysplasia or cancer.
Results showed that patients treated with radiofrequency ablation experienced an overall 7.4% decrease in risk of progression to esophageal cancer compared to patients receiving no ablation treatment. Patients in the experimental group also displayed greater incidence of complete erasure of dysplasia. Overall, experimenters observed that radiofrequency ablation was associated with a significantly reduced risk of high-grade dysplasia and cancer development in patients afflicted with Barrett esophagus with low-grade dysplasia. Experimenters recommended ablation therapy as an effective treatment for this disorder.

JAMA. 2014;311(12):1209-1217. doi:10.1001

Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor

Monday, March 24, 2014

Research Highlights: BGS Traps Could Help Control the Spread of Dengue Fever

BGS Traps Could Help Control the Spread of Dengue Fever

Mosquitos have long been considered a dangerous disease vector in certain parts of the world, spreading not only malaria but other potentially fatal diseases such as dengue fever. Female Aedes aegypti mosquitos transmit this disease to as many as 100 million people every year and about 2.5 billion people currently live in areas with a high risk of dengue infection. There is no vaccine available for this disease, so health professionals must turn to mosquito control in dengue-endemic areas in order to limit infection rates. One proposed method of control is the use of BG-sentinel (BGS) traps: mass-trapping devices that lure in mosquitos and kill them. 
A study recently published in the Journal of Medical Entomology has examined the efficacy of BGS traps in controlling mosquito populations. The study took place in the Cidade Nova neighborhood of Manaus, Brazil, a dengue-endemic area home to approximately 121,135 residents. The study period lasted from February 2009 until June 2010. Experimenters observed twelve clusters of households, each consisting of 103-151 households; six clusters served as the experimental group and the other six as the control group. Experimenters installed BGS traps in households that were part of the experimental clusters while the control cluster households received no mass-trapping devices. Experimenters also set up “monitoring” traps in all households to catch mosquitos that were not stopped by BGS traps and thus determine the efficacy of the BGS traps. Experimenters collected these traps biweekly, then examined and counted the trapped mosquitos.
All households also completed questionnaires regarding number of household members, ages of household members, education, neighborhood familiarity and solidarity, current application of mosquito control measures, and other factors that could affect the differences between the control and experimental groups, and thus affect the accuracy of the study results. Though these questionnaires showed statistically significant differences in neighborhood solidarity and familiarity between the two groups, all other variables were similar between the groups.
The results of this study showed that households using BGS traps had reduced numbers of female Aedes aegypti mosquitos present in their homes during rainy months, but there was no difference in mosquito populations during dry months. Though fewer cases of dengue infection were reported among households using BGS traps than the control households, this difference was not statistically significant. Therefore, though this study showed BGS traps to be a potentially effective solution for mosquito control under some circumstances, these results do not provide evidence for the efficacy of BGS traps on a larger scale. More research is thus needed to fully understand the role of newer mass-trapping techniques in mosquito control. 

J. Med. Entomol. 2014; 51(2): 408-420



Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor

Sunday, March 2, 2014

Research Highlights: London Public Bike-Sharing Programs May Improve Some Aspects of Citizen Health

London Public Bike-Sharing Programs May Improve Some Aspects of Citizen Health

In an era of sedentary lifestyles and environmental pollution, the idea of a “healthy city” that promotes walking and cycling in lieu of motorized transport has become increasingly popular. In 2010, the government of London, UK introduced a public bike sharing system that allows citizens to rent bikes from various stations located all over the city. A study published in the British Medical Journal examined whether this new system actually improves the health of citizens and makes London a “healthier city”. 578,607 London bike users over the age of 14 participated in this study by renting bikes between April 2011 and March 2012. Experimenters set up two scenarios: one in which the bike-sharing system existed and one in which it did not, and then modeled changes in user physical activity, intake of PM2.5 air pollutants, and incidence of road accidents in each scenario as ways of comparing user health.

Physical activity was measured in metabolic equivalent of task (MET) units. After applying statistical analyses, the results of this study showed that MET improved by 0.06 per week per person when citizens used the bike sharing system. This is not very significant on an individual scale, but could amount to a substantial improvement in citywide health if applied across all citizens. The largest health benefit for men was reduced risk for heart disease, while for women it was reduced risk for clinical depression.

Inhalation of air pollutants did not improve much with bike usage: although bike users were more removed from sites of high air pollutant concentration (i.e. busy roads and London Underground tunnels), they breathed more heavily as a result of their exercise and thus inhaled a similar level of air pollutants to non-cyclists. Results also showed that road accident rates decreased for bike users, though this was only statistically significant for minor injuries. In general, experimenters found that health benefits were much more prominent in male participants and in older participants.

Experimenters tested how results could have been affected by multiple confounding parameters in order to identify the weaknesses of this study. One prominent weakness was that the road accident component of this study relied on police injury reports, not hospital records, to estimate a bike user’s risk of injury. This means that experimenters likely overestimated the risk level of using public bikes. Another weakness of this study was that experimenters were unable to measure baseline health and physical activity of the participants, so the estimated health benefits of public bikes may be inaccurate. However, none of these weaknesses drastically affected the accuracy of this study. Overall, the study relied on an effective model that, despite some limitations, showed bike-sharing systems to benefit the physical activity and health of bike users in some ways. More research is needed to determine the full health benefits of bike sharing programs, as this study was specific to London and applied only in the short to medium term.

BMJ 2014;348:g425


Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor




Saturday, February 22, 2014

Research Highlights: Exposure to Air Pollution May Increase Risk of Coronary Events

Exposure to Air Pollution May Increase Risk of Coronary Events

A study published in the British Medical Journal (BMJ) has found strong associations between acute coronary events and exposure to air pollution. Air pollution has become an increasingly serious problem in the world and may cause millions of deaths annually. Recently, many studies have suggested that air pollutants could be linked to heart failure and other cardiovascular-related deaths. The BMJ study furthered this research by investigating the relationship between air pollution exposure and incidence of acute coronary events in several European countries.
This study combined 11 cohort studies conducted in Finland, Sweden, Germany, Denmark, and Italy. In total, 100,166 people with no previous coronary events participated. The study enlisted participants from 1997 until 2007, with a mean participant follow-up time of 11.5 years. Experimenters recorded air pollution exposure in the five European countries by using a series of filters to measure soot and black carbon levels in the air, and by measuring nitrogen oxide levels. Experimenters then used participants’ hospital records to check the number of coronary events participants experienced throughout the study. Potential confounding variables such as the marital status, education, lifestyle, physical activity levels, and occupation of the participants were also recorded. Finally, the experimenters used statistical tests to determine the association between air pollution exposure and coronary events in participants.
Of the 100,166 people enrolled in the study, 5,157 experienced coronary events. Results showed that increases in levels of certain types of air pollutants were strongly associated with a 12-13% increased risk of coronary event incidence. Experimenters found other, smaller associations between air pollutants and coronary health, but these did not reach statistical significance. Therefore, the results of this study strongly suggest that exposure to air pollution increases an individual’s risk of experiencing a coronary event.
Though the results of this study agree with past studies, as well as cohort studies conducted in the United States, researchers noted that some of the study’s results could have been caused by factors other than air pollution. For example, cohort studies with younger participants and higher rates of smoking among participants showed air pollution to have a smaller effect on coronary health. Another weakness of the study was that experimenters were only able to examine variation in air pollution levels within a cohort study (i.e. within one area of one country). They were not able to examine differences in air pollution between different cohort studies in different countries. Overall, however, this study provides new information about the serious effects of air pollution on individual health and demonstrates the urgency of limiting air pollution in the modern world.
BMJ 2014;348:f7412

Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor

Saturday, February 8, 2014

Research Highlights: Upper-Airway Stimulation Devices May Be an Effective Treatment for Obstructive Sleep Apnea

Upper-Airway Stimulation Devices May Be an Effective Treatment for Obstructive Sleep Apnea

The New England Journal of Medicine (NEJM) recently published a study investigating the efficacy of upper-airway stimulation as a treatment for obstructive sleep apnea. Obstructive sleep apnea is a disorder in which one’s upper airways are narrowed or closed during sleep. It is common, often resulting in extreme fatigue and reduced quality of life. Moderate to severe obstructive sleep apnea may have heightened risks, including vascular disease and even death. A popular current treatment for obstructive sleep apnea is the CPAP mask, which can effectively combat airway closures at night but is entirely reliant on patient adherence to treatment. Since many individuals struggle to maintain their treatments themselves, it is important to develop new treatments, such as upper-airway stimulation, that do not rely on patient adherence.
126 individuals with moderate to severe sleep apnea who struggled to adhere to the CPAP treatment participated in this study. Experimenters excluded individuals with severe neuromuscular, pulmonary, or heart diseases as well as those with other sleep conditions unrelated to sleep apnea. Experimenters surgically implanted an upper-airway stimulation device into the participants and then monitored how many episodes of sleep apnea the participants experienced each night over a period of 12 months.
Though this study was conducted with many safety precautions, two participants experienced adverse effects due to complications involving the surgically implanted device. Some other participants suffered from minor side effects such as tongue weakness, sore throat, and pain at the incision site, but these adverse effects were not considered serious and were generally resolved by making small adjustments to the implanted devices.
Despite these surgical complications, the implanted device showed evidence of being an effective treatment. After 12 months, researchers found that the participants had experienced 68-70% fewer sleep apnea episodes per hour. Participants also scored significantly better on the FOSQ and Epworth Sleepiness Scale, indicating that they were far less fatigued than they had been prior to the treatment. In an additional withdrawal trial, 23 randomly selected participants had their devices switched off for one week and experienced a sharp increase in episodes of sleep apnea, strongly suggesting that the upper-airway stimulation device was the direct cause of their prior decrease in sleep apnea episodes. Though more experiments are needed to determine the safety and efficacy of this device, upper-airway stimulation has thus far shown evidence of being a powerful new treatment for obstructive sleep apnea, especially for those who struggle to adhere to the CPAP treatment.

NEJM 2014; 370:139-149


 

Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor

Monday, February 3, 2014

Research Highlights: Coronary Artery Calcium Volume May Be Early Predictor of Heart Disease

Coronary Artery Calcium Volume May Be Early Predictor of Heart Disease

A study recently published in the Journal of the American Medical Association (JAMA) has suggested that individuals with large amounts of coronary artery calcium are at a greater risk for cardiovascular disease events such as stroke and cardiac arrest. Coronary artery calcium (CAC) has long been considered a predictor for cardiovascular disease events, but very little research has yet explored the particular measure of CAC that is the most effective predictor. This study looked specifically at both CAC volume and density and examined the association between these factors and cardiovascular disease.
Experimenters measured the CAC volume of a group of men and women between the ages of 45 and 84 with no history of heart disease or events. Of this group, 3,398 individuals had CAC levels greater than zero, and these individuals went on to participate in the study. Experimenters also tested participants for cardiovascular risk factors. The study began with these tests in 2000-2002 at six different field centers across the United States, and follow-up continued until 2008-2010, with a mean follow-up time of 7.6 years. Experimenters used electron-beam CT scanners to measure starting CAC volume and density in participants.
Overall, 265 cardiovascular disease events occurred during this study. Experimenters found that the higher an individual’s CAC volume was, the more likely they were to experience cardiovascular disease events. Thus, this study showed a strong association between CAC volume and cardiovascular disease. Experimenters noted that measuring CAC volume could be an excellent predictor for cardiovascular disease in people who are at an intermediate risk level.
Unlike participants with high CAC volume, individuals with high CAC density did not experience more events than those with low CAC density. In fact, experimenters measured an inverse association between cardiovascular disease events and CAC density, which is in accordance with other current research. Based on these results, experimenters advised that CAC density should be taken into account when testing patients for cardiovascular disease risk, not simply CAC volume. This research is thus important, as it not only supports CAC volume as a predictor of heart disease, but also reveals how measurements of cardiovascular disease risk may be inaccurate if CAC density is not also considered.

JAMA. 2014;311(3):271-278


 

Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor

Saturday, December 7, 2013

Research Highlights: Potential DNA/rAd5 HIV-1 Vaccine Shown to Be Ineffective in Recent Trial

Potential DNA/rAd5 HIV-1 Vaccine Shown to Be Ineffective in Recent Trial

A study published in the New England Journal of Medicine investigated the efficacy of a new vaccine to prevent HIV-1 infection. The participants in this study were 2,504 men or transgender women who engage in frequent unprotected sex with men, as this demographic has a high risk of contracting HIV-1 within the United States. Experimenters randomly assigned participants to receive either a placebo or the DNA/rAd5 vaccine in a double-blind design. From the 28th week of treatment until the end of the second year of treatment, experimenters checked participants for HIV-1 infection. If participants were diagnosed with the disease during this time, experimenters monitored their viral-load set point (the amount of HIV-1 RNA in their plasma) to observe the progression of HIV-1 infection.
The vaccine was a DNA/rAd5 regimen composed of two parts: vaccinated participants received three 4-mg injections of the DNA component, then later received one injection of the rAd5 component as a boost.
During the monitoring period of the 28th week until the second year of treatment (referred to as Week 28+), 27 participants receiving vaccinations and 21 participants receiving placebos were diagnosed with HIV-1. The mean viral-load set point for vaccinated participants was 4.46 log10 copies per milliliter, and the mean viral-load set point for placebo-group participants was 4.47 log10 copies per milliliter. The two groups thus had very similar plasma levels of HIV-1 RNA. The vaccine, therefore, did not help to prevent HIV-1 infection and did not reduce viral-load set points in infected patients.
Given the prevalence of HIV/AIDS infection globally, the search for an effective vaccine is a popular research issue. This experiment is the sixth efficacy trial of an HIV-1 preventative vaccine to date, yet almost all of these trial vaccines have been unsuccessful in inhibiting HIV-1 infection. Though no effective treatment has yet been developed, even unsuccessful attempts, such as the vaccine tested in this study, lend important contributions to the study of HIV-1 and to the eventual establishment of a working vaccine.

NEJM 2013; 369:2083-2092


Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor

Sunday, December 1, 2013

Research Highlights: Thalidomide May Increase Clinical Remission Rates in Children with Pediatric-onset Crohn Disease

Thalidomide May Increase Clinical Remission Rates in Children with Pediatric-onset Crohn Disease

The Journal of the American Medical Association (JAMA) recently featured a study indicating that the drug thalidomide may be an effective treatment for pediatric-onset Crohn disease. Currently, pediatric-onset Crohn disease is far more difficult to treat than adult-onset Crohn disease, as it is more aggressive and is resistant to most preventative drugs. This lack of effective treatment can irreparably harm children afflicted with the disease. This study was thus an important preliminary step in finding new treatment options.
The study looked at 56 children at six pediatric clinics in Italy who had been diagnosed with active Crohn disease and whose conditions had not improved with previous treatments. Investigators randomly assigned the children to either thalidomide or a placebo in a double-blind trial. During the first eight weeks of the trial, 46.4% of the children receiving thalidomide achieved clinical remission, compared with only 11.5% of children receiving the placebo. Some children in the placebo group who were not in remission later received thalidomide, and 52.4% of this crossover group also achieved remission. In total, experimenters saw clinical remission in 63.3% of the children treated with thalidomide, far greater than the placebo group's initial remission rate of 11.5%.
Furthermore, follow-up studies revealed that children receiving thalidomide experienced much longer periods of clinical remission than the children taking placebos. For children in the thalidomide group, the average remission time was 181.1 weeks, while the average remission time was only 6.3 weeks for children in the placebo group.
Overall, treatment with thalidomide resulted in much greater remission rates of Crohn disease as well as substantially longer periods of remission. The results of this study thus strongly suggest that thalidomide could be an effective treatment for the disease. Further trials are needed to determine the true efficacy of the drug in treating pediatric-onset Crohn disease. For now, however, thalidomide is a promising potential remedy for a disease with few effective treatments.

JAMA. 2013;310(20):2164-2173


Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor

Saturday, November 16, 2013

Research Highlights: Compilation of Studies Suggests that Bariatric Surgeries May Be More Effective at Treating Obesity than Non-Surgical Treatment Options

Compilation of Studies Suggests that Bariatric Surgeries May Be More Effective at Treating Obesity than Non-Surgical Treatment Options

A recent article published in the British Medical Journal indicated that bariatric surgeries to treat obesity may be more effective than non-surgical obesity treatments. Bariatric surgery is defined as a surgery that reduces the size of the stomach to promote patient weight loss. Non-surgical treatments include diet changes, increased exercise, pharmacotherapy, and general lifestyle alterations. The results of this research were based on a series of randomized studies conducted on patients with a body mass index of 30 or greater. The patients received either bariatric surgery or a non-surgical therapy to treat their obesity, and investigators recorded body weight, waist circumference, blood pressure, glucose levels, and several other criteria to determine the relative effectiveness of the treatments.
In all studies, patients undergoing bariatric surgery lost more weight than patients receiving non-surgical treatment, with a mean weight-loss difference of 26 kilograms between the two options. Patients' waist circumference also decreased far more after bariatric surgery, with a mean difference of 16 centimeters between the two treatment options. There was no statistically significant difference in changes in blood pressure or total cholesterol levels between treatments. However, patients undergoing bariatric surgery also showed a greater overall decrease in triglyceride and glucose levels, as well as higher remission rates for type 2 diabetes and metabolic syndrome in some studies.
There is some uncertainty in these results, as investigators compiled only 11 different studies, all relatively small and with follow-up periods of at most two years. Investigators consider these factors to be limitations of this research. Furthermore, each study was conducted under different conditions: five studies focused specifically on patients with type 2 diabetes, three contained only patients who had previously tried to lose weight, and one focused only on patients with obstructive sleep apnea. However, investigators argue that the variety of study conditions is a strength of this research, since bariatric surgery was consistently more effective than non-surgical treatments across a wide range of subgroups and study conditions. This research can thus be considered a comprehensive comparison of bariatric surgery and non-surgical treatments, though more studies must be conducted to confirm the greater effectiveness of bariatric surgery.
BMJ. 2013. 347: f5934.



Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor

Thursday, November 7, 2013

Research Highlights: Dietary Iron Supplements May Not Increase Risk of Malaria in Ghanaian Children

Dietary Iron Supplements May Not Increase Risk of Malaria in Ghanaian Children

A study suggesting that iron supplements do not increase the incidence of malaria in Ghanaian children, based on data collected in 2010, was recently published in the Journal of the American Medical Association. Previous studies have suggested that children living in areas with high rates of malaria are more susceptible to the disease if they take iron supplements. The recent study tested this hypothesis by observing the incidence of malaria in two randomly assigned groups of children: one receiving micronutrient powder (MNP) containing iron supplements and one receiving MNP without iron supplements.
A total of 1,958 children aged 6 to 35 months living in central Ghana participated in the study. Only children who had not recently taken iron supplements, who did not have a chronic illness, and who were not severely anemic were enrolled. All children received insecticide-treated bed nets to protect against mosquito bites and were given MNP for five months, then were observed for one additional month. Children who became feverish were tested and treated for malaria.
The data collected in this study initially showed lower rates of malaria in children taking iron supplements, but the difference between the two groups proved insignificant after the data were adjusted for error. The results thus showed no clear association between taking iron supplements and the risk of malaria in Ghanaian children. Researchers added that this study only considered settings where malaria treatment and prevention were readily available, so the results may not hold elsewhere.
This study is important for countries like Ghana where malaria and iron deficiency are both prevalent. The practice of providing iron supplements to counter anemia has been limited in Ghana due to the worry that these supplements will increase the incidence of malaria, but the results of this study show that this worry may be unfounded where effective malaria treatment is available. More research must be conducted to rule out any link between iron supplements and malaria risk. For now, the results of this study support new opinions about health policy in countries like Ghana, as the World Health Organization recently recommended that iron supplements be provided in regions where malaria prevention and treatment are implemented.

JAMA. 2013. 310(9): 938-947

Caroline Russell-Troutman is the 2013-2014 Research Highlights Editor

Saturday, March 2, 2013

Research Highlights: Preliminary Study Shows Increased Rates of Experimentally Induced Viral Infections Associated with Short Telomere Length

Preliminary Study Shows Increased Rates of Experimentally Induced Viral Infections Associated with Shorter Telomere Length in Healthy Adult Populations

By: Joseph St. Pierre

According to an article recently published in the Journal of the American Medical Association, shorter telomere length in healthy immune-response cells was found to be associated with higher rates of upper respiratory infection after experimentally introduced doses of the common cold virus, rhinovirus type 39 (RV39). Telomeres, structures of DNA and protein that cap the ends of each chromosome in humans, shorten with each cell division, ultimately limiting cell growth and metabolism. Because it is consistently associated with the onset of age-related morbidity, telomere length has long been considered a major contributor to functional decline in aging populations. However, at the time of this article's release, no other research examining the effect of decreased telomere length in younger, healthy adult populations had been published.

The study enrolled 152 participants between 18 and 55 years of age. Blood was drawn for telomere length assessment. All participants then received doses of RV39 via nasal drops and were quarantined for six days, during which nasal lavage samples were collected and evaluations for signs of illness were performed. Blood samples were collected 28 days after exposure to gauge antibody response to RV39. Finally, the association between the telomere length of various peripheral blood mononuclear cells (T-cell subsets CD4, CD8CD28+, and CD8CD28-) and the rates of infection and clinical illness was examined via statistical analysis. Overall, shorter telomere length was found to be associated with an increased infection rate. However, telomere length was associated with the rate of clinical illness in only one of the cell types studied (CD8CD28-).

It should be noted that CD8CD28- cells lack the CD28 protein, which has long been associated with the regulation and maintenance of telomeres in T-cell populations, making this finding particularly noteworthy. However, given the small study population and the preliminary nature of the study, more research must be conducted to examine these phenomena further.

JAMA. 2013;309(7):699-705

Joseph St. Pierre is the 2012-2013 Tuftscope Research Highlights Editor

Monday, February 25, 2013

Research Highlights: New Model for Lung Cancer Screening More Effective, Study Shows

New Model for Lung Cancer Screening Selection Criteria Shown to Be More Effective

by: Joseph St. Pierre


Recommending lung cancer screening for patients deemed high-risk under the current criteria developed by the National Lung Screening Trial (NLST) has been shown to lower the overall mortality rate of the disease by 20%. Those criteria include being between 55 and 74 years of age, a smoking history of at least 30 pack-years, and a period of less than 15 years since smoking cessation, or a modified set of requirements based on these criteria. However, according to an article published in the New England Journal of Medicine, researchers have developed a modified model based on risk factors used in the Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial. The new model, which incorporates additional risk factors such as BMI, family history, and smoking status, has been shown to be significantly more efficient at identifying high-risk patients and increasing diagnostic yield.


The study was performed by analyzing the intervention- and control-group data of the two aforementioned trials, NLST and PLCO, which encompassed 53,202 and 80,375 smokers, respectively. An additional group of 15,099 PLCO intervention-group members who met the NLST criteria was also included. Statistical analysis compared the effectiveness of the new, modified PLCO risk-based model, named PLCOM2012, with the unmodified NLST criteria. Compared with the NLST criteria, PLCOM2012 had significantly higher sensitivity and positive predictive value, allowing it to miss 41.3% fewer lung cancer diagnoses.
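The comparison above hinges on two screening metrics, sensitivity and positive predictive value. The sketch below defines both; the counts are hypothetical, chosen only to illustrate how the metrics are computed, and are not the trial's actual numbers:

```python
# Sensitivity and positive predictive value (PPV) for a screening criterion,
# computed from a confusion matrix. All counts below are hypothetical.
def sensitivity(tp, fn):
    """Fraction of true lung-cancer cases the criteria flag as high-risk."""
    return tp / (tp + fn)

def ppv(tp, fp):
    """Fraction of flagged high-risk patients who actually develop cancer."""
    return tp / (tp + fp)

# Hypothetical example: 678 cancers among flagged patients, 142 cancers
# missed by the criteria, 24,000 flagged patients who never develop cancer.
tp, fn, fp = 678, 142, 24_000
print(f"sensitivity: {sensitivity(tp, fn):.1%}")  # share of cancers caught
print(f"PPV:         {ppv(tp, fp):.2%}")          # diagnostic yield per flag
```

Higher sensitivity means fewer missed cancers; higher PPV means fewer patients screened per cancer found — the two axes on which PLCOM2012 outperformed the NLST criteria.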


While these data may suggest that PLCOM2012 is an obvious improvement over current lung cancer screening methods, it should be noted that PLCOM2012 is far more difficult to apply than its predecessor, as it hinges on complex modeling and multivariate statistics. Furthermore, PLCOM2012 occasionally uses parameters, such as follow-up time, that differ from its older counterpart, rendering certain comparisons inexact. However, for a disease whose early detection is key to successful treatment, such findings are, at the very least, another step toward a more efficient means of recognizing at-risk individuals.

NEJM. 2013. 368:728-736

Joseph St. Pierre is the 2012-2013 Tuftscope Research Highlights Editor


Tuesday, February 5, 2013

Research Highlights: Inclusion of Antibiotics in Nutritional Therapy for Children Afflicted with Acute Malnutrition Associated with Improved Recovery and Decreased Mortality Rates

Inclusion of Antibiotics in Nutritional Therapy for Children Afflicted with Acute Malnutrition Associated with Improved Recovery and Decreased Mortality Rates

By: Joseph St. Pierre

The New England Journal of Medicine recently published an article detailing the association between including an antibiotic regimen in nutritional therapy for malnourished children 6 to 59 months of age and improved recovery time, weight gain, and reduced mortality rates. In the study described, pediatric researchers found that acutely malnourished children whose nutritional therapy was supplemented with a regimen of either amoxicillin or cefdinir experienced significantly lower rates of treatment failure, as well as significantly lower mortality rates, than children whose nutritional therapy was supplemented with only a placebo.

The study enrolled 2,767 malnourished children across 18 feeding centers throughout Malawi. Participating children were reported to have similar baseline characteristics and received outpatient care. Participants were randomly assigned to one of three groups, each of which supplemented the usual routine of counseling and a daily dose of RUTF (Ready-to-Use Therapeutic Food) with amoxicillin, cefdinir, or a placebo during the first seven days of therapy. Over the course of the study, recovery rates, mortality rates, and weight and length gain were documented and compared among groups via statistical analysis. Children receiving the placebo suffered significantly higher occurrences of treatment failure and death. Furthermore, the groups receiving amoxicillin or cefdinir experienced significantly shorter recovery times than the placebo group, with the amoxicillin group exhibiting the shortest recovery times of the three.

Researchers acknowledge factors that may limit the applicability of the data. For example, the study was performed in Malawi, where HIV is prevalent. Only 31.6% of the study's participants were tested for HIV, and those confirmed to have the disease were at greatest risk of treatment failure. Factors like age also had a significant effect on treatment outcome, with younger participants exhibiting higher rates of treatment failure. However, the data do support the possibility that, even during nutritional therapy, malnourished children remain especially vulnerable to bacterial infection, and that more research should assess whether supplementing RUTF nutritional therapy with an antibiotic regimen could significantly benefit high-risk populations.

NEJM. 2013. 368: 425-435.

Joseph St. Pierre is an affiliated staff writer for Tuftscope (2012-2013)

Sunday, October 14, 2012

Research Highlights: Depression May Be Related to Survival of Patients with Renal Cell Carcinoma

Depression May Be Related to Survival of Patients with Renal Cell Carcinoma
Reviewed by: Ariel Lefland

A recent study published by PLOS ONE identified symptoms of depression and dysregulation of cortisol as key factors in predicting survival in renal cell carcinoma (RCC) patients. The study provides the first evidence of this association, controlling for disease- and treatment-related factors such as age, sex, ethnicity and disease risk index.

In a prospective study, researchers followed 217 patients with RCC, a life expectancy greater than four months, a Zubrod performance status of 2 or less, and no other serious illnesses. Patients gave blood and saliva samples and completed several psychosocial questionnaires (e.g., the Center for Epidemiologic Studies Depression Scale, or CES-D). Whole-genome transcriptional profiling was performed on samples from 15 patients with the highest levels of depressive symptoms. Transcript analyses indicated an up-regulation of genes involved in inflammation, immune response, and negative regulation of programmed cell death, and a down-regulation of genes involved in cell trafficking, adhesion, oxygen transport, and hemostasis. In patients with high CES-D scores, 116 transcripts were up-regulated by an average of 50% or more, and 57 transcripts were down-regulated by 50% or more. Furthermore, individuals with high CES-D scores had significantly more tumor-associated macrophages than patients with low CES-D scores. Researchers also found that cortisol regulation may play a role in the correlation between symptoms of depression and cancer progression.

While this study cannot show that depressive symptoms lead to the progression of RCC, it demonstrates an important connection between psychological wellbeing and disease prognosis. More research must be conducted to determine whether depressive symptoms decrease survival rates or if RCC, in fact, causes these depressive symptoms.

PLoS ONE 7(8): e42324. 

Ariel Lefland is the 2012-2013 Research Highlights Editor

Sunday, April 25, 2010

Research Highlights: Patients Starting Anticonvulsant Drugs Have an Increased Risk of Suicide

Patients Starting Anticonvulsant Drugs Have an Increased Risk of Suicide
By: Caroline Melhado

The Journal of the American Medical Association published an article on the increased rate of suicide among patients prescribed several forms of anticonvulsants. After a small meta-analysis performed by the FDA resulted in a warning label on anticonvulsants for increased suicidal thoughts, researchers sought to perform a larger cohort study to confirm the FDA's findings. They found that patients newly prescribed gabapentin, oxcarbazepine, lamotrigine, or tiagabine had a significantly increased risk of suicide compared with patients prescribed topiramate.

The study cohort comprised 297,620 individuals who started an anticonvulsant drug at the beginning of the study, had not previously taken any anticonvulsant, and had no history of suicide attempts. Participants were followed for 180 days after their initial prescription fill. The study recorded 827 suicidal acts and 41 violent deaths. Compared with topiramate, gabapentin was associated with an excess of 5.6 suicidal events per 1,000 individuals, oxcarbazepine with an excess of 10.0 per 1,000, and tiagabine with an excess of 14.1 per 1,000. These increases remained even after accounting for confounding factors such as age and mood disorders.
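An "excess per 1,000" figure is simply the absolute difference between two event rates, scaled up. The sketch below shows the arithmetic; the underlying rates are hypothetical placeholders (the study reports only the resulting excesses), chosen so the example reproduces the 5.6/1,000 figure for gabapentin:

```python
# Excess risk as extra events per 1,000 individuals: the difference between
# a drug's event rate and the comparator's (topiramate) rate, scaled by 1,000.
# The input rates below are hypothetical, for illustration only.
def excess_per_1000(drug_rate, comparator_rate):
    """Absolute rate difference, expressed as events per 1,000 individuals."""
    return (drug_rate - comparator_rate) * 1000

# e.g., if topiramate's event rate were 3.0/1,000 and gabapentin's 8.6/1,000:
print(round(excess_per_1000(0.0086, 0.0030), 1))  # 5.6 extra events per 1,000
```

Note that an absolute excess of this kind says nothing about the baseline rate itself, which is why the study's adjustment for confounders such as age and mood disorders matters.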

The cause of these findings is still unknown; however, anticonvulsant drugs have a plethora of psychotropic effects that make them useful in treating numerous diseases. The study has several limitations, including residual confounding and misclassification of suicide attempts. However, because it is the first of its kind and size, it will be instrumental for physicians looking to prescribe first-line anticonvulsants to a wide range of patients.


JAMA. 2010;303(14):1401-1409.

Sunday, April 18, 2010

Research Highlights: Study Finds Vitamins not Effective in Lowering Pregnancy Related Hypertension

Study Finds Vitamins not Effective in Lowering Pregnancy Related Hypertension
By: Caroline Melhado

The New England Journal of Medicine published a study investigating the effect of supplemental vitamins on pregnant women at risk of hypertension. Researchers performed a double-blind, randomized trial to test the efficacy of supplemental vitamins C and E in preventing pregnancy-associated hypertension. They found that the supplements did not significantly decrease the risk of hypertension or of adverse perinatal or maternal outcomes.

The outcome data came from 9,969 women who had no previous symptoms of hypertension and had not passed the 16th week of gestation. The women were randomly assigned to receive either 1,000 mg of vitamin C and 400 IU of vitamin E or a placebo. The outcomes investigated were pregnancy-related hypertension, eclamptic seizure, preterm birth, and maternal or perinatal death. The results showed that vitamin C and E supplements did not significantly reduce the risk of hypertension: the relative risk in the vitamin group was 1.07.
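A relative risk (RR) near 1.0 is what "no significant effect" looks like numerically: the treated group's event rate is almost identical to the control group's. The sketch below shows the calculation with hypothetical counts chosen only to reproduce the reported RR of 1.07; they are not the trial's actual figures:

```python
# Relative risk (RR): the event rate in the treated group divided by the
# event rate in the control group. All counts below are hypothetical.
def relative_risk(events_treated, n_treated, events_control, n_control):
    """Risk in the treated group divided by risk in the control group."""
    risk_treated = events_treated / n_treated
    risk_control = events_control / n_control
    return risk_treated / risk_control

# e.g., 1,070 hypertension cases among 5,000 vitamin-group women versus
# 1,000 cases among 5,000 placebo-group women:
print(round(relative_risk(1070, 5000, 1000, 5000), 2))  # 1.07
```

An RR of 1.07 means the vitamin group's risk was about 7% higher than the placebo group's, a difference small enough to be consistent with chance in this trial.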

While some earlier studies had suggested a role for vitamins C and E in lowering rates of hypertension in pregnant women, more recent studies have not been able to reproduce this result. Most of the women in the trial were already taking prenatal vitamins, so many researchers suggested that the extra supplement was superfluous given already normal levels of vitamins C and E.

NEJM Volume 362:1282-1291

TuftScope: The Interdisciplinary Journal of Health, Ethics, and Policy

TuftScope is a student journal published biannually in conjunction with Tufts University since 2001. Funding is provided by the Tufts Community Union Senate. The opinions expressed on this weblog are solely those of the authors. The staff reserves the right to edit blog postings for clarity and to remove nonfunctional links.

