Using Meetings At Hospitals

Small team meetings are a great way to both improve communication between staffers and set goals to deal with problems at the workplace, whether it’s in a hospital or a more traditional office.  For example, the Boston Medical Center internist teams meet every Friday morning to talk about issues specific to their groups, and once a month all of the teams come together for a larger meeting for big announcements and to celebrate successful efforts.  Such meetings help break down barriers and make improvement projects more effective by giving different team members a stronger voice in the decision-making process.  

In addition, meetings offer all team members a chance to show why certain changes are necessary; for example, the physician might not know why the front desk does things a certain way, and vice versa.  Team meetings are a great way to clear up that kind of mystery.  Poor communication is a major problem in a hospital workplace, and practice managers sometimes issue directives that are unclear, unspecific or fail to convey the proper urgency.  On average, ineffective communication takes up 40 minutes of an employee’s day, costing about $5,200 a year for each staff member at a large hospital.  
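The cost figure above can be sanity-checked with a quick back-of-the-envelope calculation.  The workdays-per-year and hourly-cost values below are assumptions chosen to be plausible for a large hospital, not numbers from the source:

```python
MINUTES_LOST_PER_DAY = 40   # from the article
WORKDAYS_PER_YEAR = 250     # assumption: a typical full-time schedule
HOURLY_COST = 31.20         # assumption: fully loaded cost per staff hour

# 40 minutes a day adds up to about 167 hours per staff member per year
hours_lost_per_year = MINUTES_LOST_PER_DAY / 60 * WORKDAYS_PER_YEAR
annual_cost = hours_lost_per_year * HOURLY_COST
print(round(annual_cost))   # 5200
```

Under those assumptions, the arithmetic reproduces the article’s roughly $5,200-per-employee figure.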

The American Medical Association has a “STEPS Forward module”, which offers several ways for practice managers to effectively structure and schedule meetings.  These include scheduling meetings when patient care is least likely to conflict, limiting group size to make sure everybody has a voice, keeping meetings focused on the issues at hand and sticking to a consistent agenda.  For projects coming out of practice meetings, assign a point person who will coordinate the efforts and then report back to the group later.  But also make sure that you conduct regular follow-ups on goals and issues discussed during these meetings.  


New Pathology Center at Mount Sinai

The department of pathology at the Icahn School of Medicine at Mount Sinai has just established the Center for Computational and Systems Pathology, with the goal of using advanced computer science, mathematical techniques, cutting-edge microscope technology and AI to revolutionize pathology practice.  This new facility will explore efforts to more accurately classify diseases and guide treatment with computer vision and machine learning techniques.  It will also serve as a hub for the development of new tests, partnering with Mount Sinai-based “Precise Medical Diagnostics” (Precise MD).  

The new center will be overseen by Carlos Cordon-Cardo, MD PhD, who will continue in his role as chair of the department of pathology at the Mount Sinai Health System and professor at the Icahn School of Medicine.  Associate professor Gerardo Fernandez, MD will be the medical director, working closely with pathology research professor Michael Donovan, MD PhD and Jack Zeineh, MD, director of technology for Precise MD.  Precise MD is developing new approaches to characterizing an individual’s cancer by combining multiple data sources and then using mathematical algorithms to analyze them, offering a more sophisticated alternative to standard approaches.  

In its initial phase this summer, Precise MD will complete a test for patients who have had prostatectomies at the Mount Sinai Health System, to determine which of them are more likely to have a recurrence of cancer and may need additional therapy.  The approach will give researchers an in-depth knowledge of the biological behavior of prostate cancer, which will allow them to choose the appropriate patients for active surveillance.  A second test, used to characterize prostate cancer in newly diagnosed patients, will follow next year; by then, all prostate cancer patients at Mount Sinai will have the chance to receive it.  


Beware the IDES

Researchers at the Stanford University School of Medicine have devised a way to significantly raise the sensitivity of a technique to identify and then sequence DNA from cancer cells that circulate in a person’s blood, with the hope that such “liquid biopsies” of easily-obtained blood samples could eventually replace the need to surgically obtain tumor tissue for study.

This new approach works by identifying errors that occur when tumor DNA is captured from the blood and prepared for sequencing.  Removing these errors from the sequencing results allows researchers to more accurately find true cancer-associated mutations in even minuscule amounts of starting material.  Researchers say this means they’ll be able to detect specific mutations in the cancer DNA that could help drive treatment choices or reveal the presence of residual cancer, getting us closer to reducing the need for invasive biopsies to identify tumor mutations or track response to therapies.

Even without treatment, cancer cells are constantly dividing and dying, releasing DNA into the bloodstream.  Learning to identify and read these fragments, and to pick out the one in 1,000 or one in a million that comes from a cancer, could help clinicians quickly and noninvasively monitor the presence and volume of a tumor, a patient’s response to therapy and even how the tumor’s mutations evolve over time in the face of treatment or other selective pressures.  The researchers termed their new, two-pronged approach “integrated digital error suppression”, or IDES.  It builds upon a method called CAPP-Seq that was previously designed to capture very small amounts of tumor DNA from the blood.

IDES extends CAPP-Seq by addressing an inherent technical limitation: an inability to accurately sequence very small quantities of DNA.  Before any sequencing can be attempted, many copies must be made of each double-stranded DNA fragment in a process known as amplification, and with each round of copying the chance of introducing a sequence error grows.  The researchers therefore needed a way to determine whether mutations identified during sequencing came from the tumor or were introduced during amplification.  They tagged the circulating double-stranded DNA molecules with “bar codes” that uniquely mark each original molecule.  Because the two strands of an original DNA molecule are complementary, the sequence of one strand can be predicted from the other; the bar codes therefore allowed the researchers to match up the two strands and compare them.

The researchers say their technique is a significant advance because it eliminates more false positives without sacrificing true positives.  Tagging the original DNA molecules lets them keep track of which molecules have been faithfully reproduced during sequencing.  The bar-coding approach was then combined with a computer algorithm that scans the data and flags possible trouble spots.  This combination allowed the researchers to filter out common sequencing mistakes far more efficiently than either technique used on its own.
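The core idea behind the bar-code strand matching can be sketched in a few lines of code: a sequence is trusted only when both strands of the same barcoded source molecule agree.  This is an illustrative toy model, not the actual IDES implementation; the read format and the `duplex_consensus` helper are invented for the example.

```python
from collections import defaultdict

def duplex_consensus(reads):
    """Keep a sequence only if both strands ('+' and '-') of the same
    barcoded source molecule report identical bases; discordant pairs
    are treated as amplification/sequencing errors and dropped."""
    by_barcode = defaultdict(dict)
    for barcode, strand, seq in reads:
        by_barcode[barcode][strand] = seq
    kept = []
    for barcode, strands in sorted(by_barcode.items()):
        if "+" in strands and strands.get("+") == strands.get("-"):
            kept.append((barcode, strands["+"]))
    return kept

reads = [
    ("BC1", "+", "ACGT"), ("BC1", "-", "ACGT"),  # strands agree: kept
    ("BC2", "+", "ACGA"), ("BC2", "-", "ACGT"),  # strands disagree: dropped
]
print(duplex_consensus(reads))  # [('BC1', 'ACGT')]
```

A mutation introduced during amplification typically appears on only one strand, so requiring agreement between complementary strands filters it out, while a true tumor mutation present in the original molecule survives on both.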


The Rise of Telemedicine

A recent survey has found that telemedicine has continued to evolve into a mainstream technology service, with a growing number of healthcare professionals viewing it as a top priority.  The 2016 Telemedicine Industry Benchmark Survey by the telemedicine software company Reach Health surveyed some 390 healthcare professionals, including executives, physicians, nurses and other professionals, gathering input on their priorities, objectives and challenges, telemedicine program models and management structures, service lines and settings.  Reach Health compared these findings to the results of last year’s survey in an effort to better understand the trends and changes in telemedicine.

Among the survey’s key findings was that nearly two-thirds of its participants viewed telemedicine as a top priority, representing a 10 percent increase from last year.  When respondents were asked about the top objectives for their telemedicine programs, patient-oriented objectives occupied the three most common positions.  When asked to rate their success in achieving telemedicine program objectives, respondents indicated a high degree of success with those same top three objectives.  Respondents said that their highest degree of success came with providing remote or rural patients with access to specialists.

The survey report also took a look at the telemedicine attributes that are most highly correlated with success.  The report’s authors said that some attributes exhibit a strong correlation with success, such as the priority of the telemedicine program as ranked among other hospital priorities.  For example, telemedicine programs ranked as a top priority are 62 percent more likely to be highly successful than those ranked as a low priority.  Likewise, programs with a dedicated program manager or coordinator are 43 percent more likely to be highly successful than those with a less focused manager.

The survey also addressed the telemedicine program challenges that participants identified, ranking each challenge as unaddressed, partially addressed, fully addressed or not a challenge.  Issues tied to reimbursement and electronic medical record systems were listed as the main challenges to telemedicine.  Determining ROI was also acknowledged as a challenge, although the survey authors noted that improving financial return was the least frequently cited top objective.  Although compensation for physicians remains relatively high on the list of challenges, physician acceptance has improved compared to last year and now sits fairly low on the list.

As the telemedicine industry comes of age, hospitals and healthcare systems have been exhibiting a rising trend toward an enterprise approach, with larger systems moving more quickly in this direction than smaller hospitals.

Manipulating Pollen

Ragweed plant

Also known as Ambrosia, the ragweed plant is known for its aggressive pollen.

Recently, scientists at Helmholtz Zentrum in Munich have discovered that pollen of the common ragweed has higher concentrations of allergen when exposed to NO2 exhaust gases.  The study also indicates the presence of a possible new allergen in the plant.  Researchers at the Institute of Biochemical Plant Pathology (BIOP) studied how nitrogen oxides affect the plant’s pollen, specifically by fumigating the plants with various concentrations of NO2, which is generated during the combustion of fuel.

The data from the study revealed that the stress on the plant caused by NO2 modulated the protein composition of the pollen, with different isoforms of the known allergen Amb a 1 being significantly elevated.  In addition, the scientists observed that pollen from NO2-treated plants has a significantly increased binding capacity to the specific IgE antibodies of individuals who are allergic to ragweed, a binding that frequently triggers an allergic reaction in humans.  The plant researchers also identified a protein, not previously known to be an allergen in ragweed, that was present when NO2 levels were elevated.  It bears a strong similarity to a protein from the rubber tree that was previously described as an allergen, with similar allergenic effects also seen in fungi and other plants.

Due to air pollution, it is expected that the already aggressive ragweed pollen will become even more allergenic in the future.  Originating in the Americas, ragweed is believed to have come to Europe through imported birdseed, and due to climate change, it has become widely dispersed across the continent.  Ragweed pollen is very aggressive, and since the plant doesn’t bloom until late summer, it lengthens the “season” for those who are allergic to it.  Studies have already shown that ragweed plants growing along highways are clearly more allergenic than those growing away from road traffic.  The researchers plan further studies to show that pollen treated only with NO2 can also elicit stronger reactions in vivo.

5 Steps To a “Learning Health System”


Amy Abernethy

I recently came across an article by Dr. Amy Abernethy, who delivered the opening keynote address at the American Medical Informatics Association’s (AMIA) annual symposium.  Without good data to back it up, she says, patient-centeredness is nothing more than a buzzword.  And without a patient-centered focus and proper organization, data tends to be pretty useless.  Comparing the flood of data now available to the Amazon River, Abernethy declared the need for a vision of a way forward, called a learning health system.  This needs to be powered by a “river of data”, with a “North Star” to guide operations.  For Abernethy, that North Star is Janet, a 37-year-old patient with melanoma.

While speaking with Janet about the risks of interferon treatment, Abernethy pulled in records from various data streams that formed rivulets, which fed into a large river of data.  Informaticians, as members of the AMIA call themselves, face two challenges: exploring all of the data now available to them and explaining it to practitioners, payers, administrators and patients.  According to Abernethy, the vernacular in informatics doesn’t align with that in other parts of healthcare, meaning that there needs to be a common language.  As healthcare professionals think about bridging the communication gap, people find it safer to keep to their own sides and not talk to each other, despite how much they could be learning from one another.  Abernethy spoke of the need to get better at bringing data to standard clinical practice, and outlined five recommendations to do so:

1. Put the patient at the center.  Patients are the anchors of medical informatics, and their stories help clinicians work better with data.

2. Make data meaningful by putting the information into the right context.  This information, says Abernethy, should be treated as more than just a snapshot in time, so that it can be repurposed for research.

3. Improve data quality by putting that data to use.  Informaticians need to use data to make sure that it’s accurate, and they care deeply about data’s quality, sanctity, security and validity.

4. Make data trustworthy.  To gather information from patients over time, there needs to be a system of trust, so that the data is valid.

5. Make data interoperable.  Speaking of Janet, Abernethy said that even if her disease does return, informaticians can learn more about such diseases through her data.  She noted that the Cancer Biomedical Informatics Grid (caBIG) failed partly due to a breakdown in communications, since communicating the language and culture of informatics to clinical care at the time was nearly impossible.

Truman Medical Centers

Recently, the Kansas City-based Truman Medical Centers (TMC) has won two separate awards from the health IT community for its use of IT to improve health care.  Last month, they were recognized by the College of Healthcare Information Management Executives (CHIME) as the winner of the 2014 CHIME-AHA Transformational Leadership Award.  CHIME gives this award once a year to an organization that has excelled in developing and deploying transformational IT that improves the delivery of care and streamlines administrative services.

An impressive institution indeed, TMC is made up of two academic acute care facilities with a total of 600 beds, a behavioral health program, over 50 outpatient clinics, the Jackson County health department and a long-term care facility.  TMC serves a large number of low income, high-risk patients, providing 11% of all uncompensated care within the state of Missouri.  TMC is also a participant in the Partnership for Patients, which was established by the Centers for Medicare & Medicaid Services (CMS) to make hospital care safer, cheaper and more reliable.  Over the past few years, the organization has launched a system-wide initiative called the Q6, designed to drive quality improvement at TMC.  Q6 then led to the formation of multidisciplinary committees to help drive quality improvement across clinical workflow, IT and business processes using actionable data from the organization’s electronic health record (EHR).

TMC has also been named a 2014 HIMSS Enterprise Davies Award recipient, an award that promotes EHR-enabled improvement in patient outcomes through sharing case studies and lessons learned on implementation strategies, workflow design, best practice adherence and patient engagement.  Through a comprehensive EHR-enabled quality improvement strategy focused on adherence to best practice protocols and strong technology-enabled patient engagement, TMC has sustained exceptional care coordination while maintaining a level of care delivery that ranks significantly above national benchmarks.

Through EHR-enabled automated interpreter requests and streamlined workflows, TMC has been able to provide a more personalized care experience for each patient while simultaneously providing proper care.  This has resulted in a significant reduction in the number of episodes of venous thromboembolism and hospital-acquired pressure ulcers, saving nearly $8 million in costs.  Since much of TMC’s care is uncompensated, reducing costs is an essential part of their model.

 

Paper-Based Medical Technology


Recently, researchers in synthetic biology have been bringing together science, engineering and computing to understand and copy biological life to help achieve new breakthroughs.  Two studies recently published in the journal Cell show how advances in synthetic biology could eventually lead to cheap, reliable diagnostics for diseases such as Ebola, which could be done quickly in the field using only drops of blood or saliva on strips of paper embedded with synthetic biology circuits.

In the first of these studies, scientists from Harvard describe how they brought lab-testing ability to pocket-sized slips of paper by embedding them with synthetic gene networks.  They also spoke about how they created various diagnostics, including strain-specific Ebola virus sensors.  Until recently, progress in synthetic biology has been limited, since scientists were only able to develop synthetic mechanisms within living cells.  However, the research team was able to create a system that allowed them to design synthetic versions of biological mechanisms outside of cells.  The researchers explain how they’ve harnessed the genetic machinery of cells, and then embedded them in the fiber matrix of paper, which can then be freeze-dried for storage and transport, allowing researchers to take synthetic biology out of the lab setting and use it anywhere.

Through their work, the researchers have developed a wide range of diagnostics and biosensors, which incorporate proteins that fluoresce and change color to show that they’re working.  Once they’ve been freeze-dried, these paper-based tools can be stored for up to a year.  To be activated, all you need to do is add water.  When used in a laboratory, this technology allows researchers to save both time and costs compared to conventional methods; certain procedures that would typically take between 2 and 3 days can now be done in as little time as 90 minutes.
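For scale, the quoted turnaround improvement works out to roughly a 30- to 50-fold speedup, assuming the “2 and 3 days” refers to total elapsed time:

```python
DAY_MINUTES = 24 * 60

conventional_low = 2 * DAY_MINUTES   # 2 days, in minutes
conventional_high = 3 * DAY_MINUTES  # 3 days, in minutes
paper_based = 90                     # minutes, per the article

print(f"{conventional_low / paper_based:.0f}x to "
      f"{conventional_high / paper_based:.0f}x faster")  # 32x to 48x faster
```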

For their second study, the researchers created an Ebola sensor using the “toehold switch”, a flexible and highly programmable system for controlling gene expression.  While the toehold switch was originally designed to work inside living cells, the team was able to adapt it to their signature freeze-dried paper method.  The toehold switch can be programmed to switch on the production of a specific protein after detecting the proper sequence of genetic code.  According to the team, it’s also possible to link multiple toehold switches together to create a complex circuit that carries out a series of steps, such as detecting a pathogen and then delivering the appropriate therapy.
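Conceptually, a chain of toehold switches behaves like Boolean logic over detected sequences.  The sketch below is a loose software analogy of that behavior, not the biochemistry; the trigger sequences and therapy names are made up for illustration.

```python
def toehold_switch(trigger_seq):
    """A switch 'opens' (expresses its output) only when its exact
    trigger sequence appears in the sample RNA."""
    return lambda sample: trigger_seq in sample

# Hypothetical two-switch circuit: pathogen marker AND resistance marker
detects_pathogen = toehold_switch("AUGGCC")
detects_resistance = toehold_switch("GGAUCC")

def choose_therapy(sample):
    if detects_pathogen(sample) and detects_resistance(sample):
        return "therapy B"   # resistant strain: both switches open
    if detects_pathogen(sample):
        return "therapy A"   # susceptible strain: only the first switch opens
    return "no pathogen"

print(choose_therapy("UUAUGGCCAAGGAUCCUU"))  # therapy B
```

Chaining switches this way mirrors the “series of steps” the team describes: each downstream switch only fires once the upstream detection has occurred.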

 

The Effectiveness of Health Apps


Various health apps

These days, it seems like there’s a mobile app for just about everything.  Currently, nearly 20% of smartphone users have at least one app on their phone that allows them to track and/or manage their health, and it’s been estimated that by next year, 500 million smartphone users around the world will be using one.  These apps let us monitor nearly every factor that impacts health, including weight, blood pressure, exercise, cholesterol levels, heart rate and sleep quality, and some even claim to help detect cancer.  While these seem to have some great benefits, a lot of people still have mixed feelings about these apps.

Last year, the IMS Institute for Healthcare Informatics analyzed over 40,000 health care apps and discovered that only 16,275 of them are directly linked to patient care and treatment, while the rest do nothing more than provide information that doesn’t improve patient health or well-being in any way.  The most frequently downloaded apps claim to help with dieting, weight loss and fitness, such as MyFitnessPal, which gained 40 million users last year alone.  However, the IMS report claims that this app’s effectiveness hasn’t matched its popularity, pointing out that very few studies show how effective calorie-counting apps are.  A study from the University of Massachusetts Medical School had similar findings: only 28 of the apps incorporated even a quarter of proven lifestyle-based strategies for weight loss, such as portion control and identifying the reasons behind overeating, which means that most were likely ineffective for weight loss.

These results reveal that many app developers aren’t even including proven behavioral strategies in their apps, and without long-term data on whether these apps work, most doctors are hesitant to recommend them as an effective solution for poor eating habits.  However, not all research condemns the effectiveness of such apps; one 2012 study from Northwestern University claimed that an app that tracked eating and physical activity helped users lose 15 pounds and keep the weight off for at least a year.  However, the team admitted that the app was only effective when used alongside other weight loss support, such as nutrition and exercise classes.

While some weight loss apps are ineffective, others could actually be detrimental to health; last year, researchers from the University of Pittsburgh Medical Center questioned the accuracy of four health apps that claim to detect skin cancer.  The team discovered that even the most accurate of the cancer-detecting apps missed 18 of the 60 lesions diagnosed as melanoma, having deemed them “low-risk” for cancer.  Of course, these apps state that they’re only designed for educational purposes and shouldn’t take the place of actual medical care, but researchers are nonetheless worried about these findings.  The amazing thing is that you don’t need to be a medical professional, or even seek medical input, to develop a health app, which many find disconcerting.
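Missing 18 of 60 melanomas corresponds to a sensitivity of only 70 percent, which is easy to verify:

```python
melanoma_lesions = 60   # lesions diagnosed as melanoma in the study
missed = 18             # deemed "low-risk" by the most accurate app

# sensitivity = true positives / all actual positives
sensitivity = (melanoma_lesions - missed) / melanoma_lesions
print(f"sensitivity: {sensitivity:.0%}")  # sensitivity: 70%
```

For a screening tool aimed at a cancer where early detection is critical, missing nearly a third of true melanomas is the kind of gap that worries researchers.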

 

Neanderthal Study

For a long time, people believed that the Neanderthals had died out shortly after Cro-Magnon, the ancestors of modern-day humans, showed up.  However, an international study led by Oxford University suggests that the two species overlapped for up to 5,400 years, allowing plenty of time for their cultures and genes to mix with one another.  For the six-year study, researchers used improved methods of radiocarbon dating to analyze around 200 different samples of bone, shell and charcoal from 40 important archaeological sites around Europe, from Spain to Russia.  These were chosen because they either showed signs of Neanderthal tool-making or because they contained stone tools thought to be from early modern humans or Neanderthals.
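As a rough illustration of how radiocarbon dating works, a conventional radiocarbon age follows from the fraction of carbon-14 remaining in a sample (by convention computed with the Libby mean life of 8,033 years).  The 0.7% figure below is an illustrative value, not one from the study, and raw radiocarbon ages still need calibration to calendar years:

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; by convention, from the Libby half-life of 5,568 years

def conventional_radiocarbon_age(fraction_c14_remaining):
    """Conventional (uncalibrated) radiocarbon age of a sample,
    from the measured fraction of its original carbon-14."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_c14_remaining)

# A sample retaining ~0.7% of its original carbon-14 dates to roughly 40,000 years
print(round(conventional_radiocarbon_age(0.007)))
```

At these ages so little carbon-14 remains that even tiny amounts of modern contamination skew the result badly, which is why the purification methods described below matter so much.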


Despite what you might have thought, this is actually NOT Mick Jagger, but rather a reconstructed Neanderthal wearing modern clothing.

With the help of mathematical formulas, the team compared the new radiocarbon data with previous findings from studying rock layers to piece together the chronology of the findings.  The results reveal that Neanderthals disappeared from Europe between 41,030 and 39,260 years ago, long after early modern humans arrived on the scene.  According to the authors, this means that Neanderthals and early modern humans must have overlapped for a significant amount of time, giving them ample opportunity to interact and interbreed with each other.  Nonetheless, the researchers emphasize that they weren’t able to figure out exactly where in Europe interbreeding may have occurred, or whether it happened in isolated incidents or repeatedly.  The researchers say that up to 2% of DNA in today’s non-African humans originally comes from the Neanderthals, which suggests that the two groups did interbreed outside of Africa.

Evidence also hints that Neanderthals may have survived in dwindling populations in pockets of Europe before they finally became extinct.  Previous techniques for obtaining radiocarbon dates may have underestimated the age of Neanderthal samples, because those samples could have been contaminated with modern material.  The researchers instead used ultrafiltration methods, which purify the extracted collagen from bone and help avoid the risk of modern contamination.  This means that the researchers can confidently say that they’ve finally resolved the timing of the disappearance of the Neanderthals.  Of course, the Neanderthals aren’t entirely extinct, since they make up part of the genetic makeup of many modern humans.