Putting Data Behind Parkinson’s Disease

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction 

Over one million people living in the United States suffer from Parkinson’s disease – more than the number of people suffering from multiple sclerosis, muscular dystrophy, and amyotrophic lateral sclerosis combined. Parkinson’s disease is a progressive disorder of the nervous system that leads to stiffness and slowing of movement as nerve cells in the brain break down or die. Symptoms in the early stages of the disease can include reduced facial expression, slurred or soft speech, and other subtle changes in a person’s ability to move normally and easily. As the disease worsens, a patient can develop a tremor, rigid muscles, impaired balance, and a loss of automatic movements. The cause of Parkinson’s disease is unknown, but early research suggests that certain gene variations can increase one’s risk of developing Parkinson’s and that certain toxins may trigger its onset. Treatment for Parkinson’s disease is also fairly costly, and its side effects, such as decreased cognitive abilities, can often further reduce a patient’s ability to shoulder the associated costs. Clearly, there is enormous value in furthering the study of Parkinson’s disease, and one of the ways to improve such studies is to use analytics tools in disease analysis.

Discussion

There are several ways in which analytics can benefit those suffering from Parkinson’s disease; because the symptoms and effects of Parkinson’s are mainly physical, many of the metrics used in caring for patients revolve around movement. For example, activity trackers used for Parkinson’s patients include algorithms that detect abnormalities in walking patterns such as “tremor, dyskinesia, asymmetry, festination, and freezing.” These algorithms can also study a patient’s level of activity and use this information to understand how their walking patterns and habits are changing. Tremors are a common symptom of Parkinson’s and can be observed and measured using spectral analysis; measuring Parkinson’s tremors can be helpful because such “episodes are correlated to medication intake events,” allowing doctors to adjust medication as necessary based on the observed data. Furthermore, as technology continues to evolve, patients may eventually be able to understand and adjust their medication on their own. Finally, because many people with Parkinson’s experience sleep disturbances such as “insomnia, periodic limb movement disorder and REM-sleep disorder,” combining a generic sleep study with gait pattern studies and applying data science tools can provide a more accurate analysis of sleeping habits for Parkinson’s patients.
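To make the spectral-analysis idea above concrete, here is a minimal sketch in Python. It is not any vendor’s actual algorithm: the 50 Hz sampling rate, the synthetic signal, and the choice of the 4-6 Hz band (where parkinsonian rest tremor typically falls) are all assumptions for illustration. The function simply estimates how much of a wrist-accelerometer signal’s power sits in that tremor band.

```python
# Hypothetical tremor screening via spectral analysis (illustrative only).
# Assumptions: a 1-D wrist accelerometer trace sampled at 50 Hz.
import numpy as np

def tremor_band_power(signal, fs=50.0, band=(4.0, 6.0)):
    """Return the fraction of non-DC spectral power inside the tremor band."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()              # remove gravity / DC offset
    spectrum = np.abs(np.fft.rfft(signal)) ** 2  # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum[1:].sum()                   # ignore the DC bin
    return float(spectrum[in_band].sum() / total) if total > 0 else 0.0

# Synthetic example: a 5 Hz, tremor-like oscillation buried in noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 50.0)                   # 10 seconds at 50 Hz
sample = 0.5 * np.sin(2 * np.pi * 5 * t) + rng.normal(0, 0.3, t.size)
print(f"Tremor-band power fraction: {tremor_band_power(sample):.2f}")
```

A consistently high fraction across repeated windows could flag an episode worth correlating with medication intake times, which is the kind of signal-to-dose feedback loop described above.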

Another promising example of technology and analytics supporting the lives of people with Parkinson’s disease is the use of wearable devices to evaluate symptoms. Intel Corporation, in partnership with the Michael J. Fox Foundation, proposed a program in 2014 to develop a wearable tracking watch that could conveniently collect and record patient information. From these records, machine learning techniques could be applied to understand and assess the progression of a patient’s symptoms and help providers adjust care management methods or medication dosages.

Conclusion 

Parkinson’s disease is a condition that affects many Americans and many more across the globe. While no treatment to cure Parkinson’s exists, there are care management options available and applying data science tools to these options can significantly improve a patient’s quality of life. Additionally, the creation of tools such as smart watches can further improve the quality and quantity of data available to perform these studies.

 

 

How Analytics Can Impact and Improve the Health of Small Communities (+ Case Study)

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction 

Community health information can be very revealing; there is a lot to learn from the data of a specified community whether that specification be location, race, gender, occupation, age, or income bracket. However, these are relatively large communities and while that data is incredibly important, small community analytics can be even more targeted and actionable due to the ability to better communicate information to smaller groups. 

Discussion 

The biggest struggle in studying small communities is validity – many statisticians have argued that small community studies do not meet the sample-size benchmarks needed to be referenced in other studies, and this is a valid concern. However, if smaller studies are not treated as ones meant to generalize beyond their setting, they can be especially impactful for their own communities. Below, we’ve attached a case study from Cerner to demonstrate the importance of small community studies:

Cerner Case Study

 

Every year, approximately 735,000 Americans have a heart attack. There’s great interest in improving this number, and one of the ways we can contribute to that goal is by quickly identifying and treating heart attacks. Troponin tests are commonly used in the emergency department (ED) to identify whether a patient is experiencing a heart attack. In an ideal setting, the turnaround for a troponin test is about 35 minutes; most hospitals have a protocol target of 60 minutes or less.

We recognized an opportunity for improvement with some of our clients around their troponin test turnaround rates. We pulled data on individual clients and compared it to industry-wide data, and found that while some of our clients had fantastic numbers, others hadn’t had a focus group around this topic and there was room for improvement. If a hospital’s median turnaround time for a troponin test is 45 minutes, for example, that still means that approximately half of its tests are taking longer than that.

Though there is currently no troponin test standard mandated by the Centers for Medicare & Medicaid Services (CMS), the turnaround time clearly impacts patient care. Think of it this way: a 25-minute difference in test results is akin to an ambulance arriving to pick up an individual with heart attack symptoms and then simply waiting in the driveway for nearly half an hour.
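To make the median point in the case study concrete, here is a small sketch with made-up turnaround times (not Cerner client data) showing how a sub-60-minute median can still leave a meaningful share of troponin tests over the 60-minute protocol target.

```python
# Illustrative turnaround-time review; the values below are synthetic.
import statistics

turnaround_minutes = [22, 31, 35, 38, 41, 44, 46, 49, 52, 58, 63, 71, 80, 95]

median_tat = statistics.median(turnaround_minutes)
over_protocol = sum(1 for t in turnaround_minutes if t > 60)

print(f"Median turnaround: {median_tat:.0f} min")
print(f"Tests over the 60-minute protocol: {over_protocol}/{len(turnaround_minutes)} "
      f"({100 * over_protocol / len(turnaround_minutes):.0f}%)")
```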

Conclusion 

The study of the health of small communities can seem a little slow but actually has the potential to be extremely interesting! Small community studies can identify a range of localized issues, from problems in infrastructure to pockets of malnutrition to small disease clusters. Here at Altheia Predictive Health, we are specifically trying to make the research and benefits of big data accessible to small communities through our app. If you know a small company or organization that could benefit from our research, please send them our way!

 

The Most Personalized and Precise Form of Healthcare: a Discussion of the Genome

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction 

Our genomes are essentially a personalized index or library of everything we are – they are the combination of genes and DNA that holds all of our genetic information. The field of genomics is relatively new, with most studies citing its roots in the 1980s. In fact, the Human Genome Project only began in 1990 and was declared complete in 2003. Though new, the field of genomics, like many other fields of study touched by technology, has evolved rapidly. For context, processing a human genome would have cost $20-25 million in 2006, compared to well below $1,000 today, and its market has grown from $1 billion to $4.5 billion in the last 8 years alone. Furthermore, sequencing the first human genome took three years of processing power, while today a human genome can be processed in less than three days. The increasing accessibility of genomic information is an incredibly important development in terms of preventative care and can be a life-saving step for many people.

Discussion

The Precision Medicine Initiative “is an emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle for each person [whose] approach will allow doctors and researchers to predict more accurately which treatment and prevention strategies for a particular disease will work in which groups of people.” This is where genomics finds its highest level of applicability and importance. Genomics, combined with the power we now have in machine learning, can provide incredible insights into which genes are relevant to certain diseases. With that insight, patients can adjust their lifestyle to their risk factors and have a much better understanding of where they stand in terms of their health. Doctors also have a much clearer idea of which tests may need to be run much earlier in a patient’s life and which tests may never need to be run unless an event occurs that prompts questions outside of the norm. Overall, genomics can save a lot of time and money for both providers and patients alike.
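As a toy illustration of how machine learning can surface gene-disease relevance, the sketch below fits a logistic regression to a synthetic matrix of variant carriers and reads off the learned weights. The data, the variant names, and the model choice are all assumptions for illustration, not any particular study’s method.

```python
# Illustrative only: synthetic variants, synthetic outcomes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_patients, n_variants = 200, 10

# Each column is a hypothetical variant (1 = carrier, 0 = non-carrier).
X = rng.integers(0, 2, size=(n_patients, n_variants))
# Pretend variants 0 and 3 drive disease risk; the rest is noise.
risk_score = 0.8 * X[:, 0] + 0.6 * X[:, 3] + rng.normal(0, 0.5, n_patients)
y = (risk_score > 0.7).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
for i, weight in enumerate(model.coef_[0]):
    print(f"variant_{i}: weight {weight:+.2f}")
```

In this synthetic setup the largest weights land on the two variants that actually drive the simulated risk, which is the kind of insight that, on real cohorts, helps prioritize which genes matter for a given disease.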

One of the struggles with sequencing an individual’s entire genomic profile is the sheer processing power and storage needed to execute algorithms on an entire sequence – massive database space is necessary to perform these types of analytics. However, one approach that can make genomics even more accessible is isolated sequencing. For example, if someone already knows their family has a high risk of a certain disease, they may choose to sequence only parts of their DNA, such as the BRCA1 and BRCA2 genes sequenced for individuals with a higher risk of breast cancer. This methodology can be applied to any genetically inherited disease. However, the ultimate hope and goal for many is that genome sequencing becomes accessible enough that anyone can sequence their entire genome. Healthcare providers could then use that information to offer a much more personalized approach to a patient’s diagnosis and care plan.
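For the isolated-sequencing idea, a minimal sketch might look like the following: restrict attention to a small panel of genes such as BRCA1 and BRCA2 and report only the variants that fall inside it. The records and identifiers are made up for illustration; a real pipeline would work from sequencer output such as VCF files.

```python
# Illustrative targeted-panel filter; variant records are hypothetical.
variants = [
    {"gene": "BRCA1", "variant_id": "var_001"},
    {"gene": "TP53",  "variant_id": "var_002"},
    {"gene": "BRCA2", "variant_id": "var_003"},
]

# Panel of genes relevant to hereditary breast cancer risk.
panel = {"BRCA1", "BRCA2"}
panel_hits = [v for v in variants if v["gene"] in panel]

for hit in panel_hits:
    print(f"Panel hit: {hit['variant_id']} in {hit['gene']}")
```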

Conclusion 

In comparison to many other fields of study, genomics is very new; however, that hasn’t stopped it from catching up with (and even outrunning and outshining) many other fields in terms of accessibility. When we look at communities, whether defined by location, ethnicity, age, or gender, we get a much clearer picture of how the health of a population is influenced. As accessibility to the technology that supports genomics increases for patients, we can expect that picture to get even clearer and to see an even more personalized approach to healthcare.

 

Improving the ROI of EHRs Through Analytics

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction 

As with any business feature, business owners and analysts must consider whether the cost associated with a feature is worth the return on investment. Electronic Health Records (EHRs), though an integral part of the healthcare system, are a good example of a staple that does not always justify its cost. EHRs are, of course, extremely beneficial within healthcare; however, their high implementation and maintenance costs (think billions of dollars) mean that they are not necessarily worth the investment unless steps are taken to optimize their use. Because EHRs are ingrained in our healthcare system, the question isn’t whether they are worth the investment but how we can make them worth the investment.

Discussion 

The first and most documented issues with EHRs are their accessibility and readability, which limit their usability as well. Additionally, “EHR reports tend to run on a predetermined schedule, limiting how the data within the EHR can be used to evaluate key performance indicators, population studies, or long-term trends,” which further limits their ability to be improved upon. Many investors and market researchers say that the next step in EHR improvement is to invest heavily in programs and software that can translate data from EHRs into other software so that it can be used across different contexts. This development will allow the power of analytics to significantly improve the return on investment for EHRs by providing “insight and direction in terms of bed management, case management, ED, workforce management, scheduling, and OR management systems [such that] staff can see the upstream and downstream effects of a single operational decision.” This is important because the time it currently takes to “translate” EHR data means that time has passed since the data was collected, and, in healthcare, real-time insights and decisions can be critical. Once the issues with readability and context are solved, EHRs can be used to support predictive analytics by providing on-demand trend analysis and suggested next steps to be verified by physicians. Such insight can cut costs for hospitals by tracking patient flow, for providers by creating demographic reports, and for patients by reducing the number of tests needed for diagnosis.
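As a small illustration of the on-demand trend analysis described above, the sketch below assumes a de-identified EHR export in CSV form with hypothetical column names (admit_date, ed_wait_minutes) and computes a monthly average ED wait, the kind of KPI a fixed report schedule would not surface whenever staff need it.

```python
# Illustrative KPI trend from an assumed de-identified EHR export.
# The column names and values below are hypothetical.
import csv
from collections import defaultdict
from io import StringIO

export = StringIO("""admit_date,ed_wait_minutes
2023-01-04,42
2023-01-19,55
2023-02-02,38
2023-02-20,61
2023-03-07,47
""")

monthly = defaultdict(list)
for row in csv.DictReader(export):
    month = row["admit_date"][:7]                  # YYYY-MM
    monthly[month].append(float(row["ed_wait_minutes"]))

for month in sorted(monthly):
    waits = monthly[month]
    print(f"{month}: average ED wait {sum(waits) / len(waits):.0f} min "
          f"({len(waits)} encounters)")
```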

Conclusion 

The newest development in this space is a Google study’s use of de-identified EHRs to make patient health predictions. Though the project is still in the proof-of-concept phase, its prediction models have outperformed standard hospital models in every test thus far. This is a promising development in the optimization of EHR use that could encourage further research from smaller companies and at the university level, as well as inspire further investment toward the effort of getting the most out of Electronic Health Records.

 

 

Concerns Regarding the Trump Administration’s Contract with TeleTracking Technologies

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

 

Introduction

A few weeks ago, when the Trump Administration took Covid-19 reporting responsibilities away from the CDC, there were several questions about how data would be processed and whether or not the public could trust the accuracy of new data. Not long after that development, the Trump Administration awarded a company called TeleTracking Technologies a multi-million dollar contract to collect and report on Covid-19 data. However, inconsistencies in reporting and a lack of transparency in collection methods have raised a lot of questions regarding teletracking as a process and TeleTracking Technologies as a company.

Discussion

One of the biggest causes for concern with TeleTracking Technologies is that the company has refused to answer questions from United States senators regarding Coronavirus data, citing a nondisclosure agreement with the Trump Administration. This is deeply concerning because it limits the oversight power of the branches of government outside the executive branch. Lawyers for the company refuse to disclose how TeleTracking Technologies collects and shares its information; in our last article, we heard from physicians and hospital administrators who are extremely concerned that the process by which Covid-19 data is collected and reported will be skewed toward supporting the Trump Administration’s political goals and, given President Trump’s close ties to the founder and CEO of TeleTracking Technologies, this notion does not seem outside the realm of possibility. The move has been heavily criticized by researchers and academics, who cannot accurately conduct their own research without transparent data collection and reporting practices.

Finally, a huge concern is that these policy and process changes are coming abruptly and at an awful time. Carrie Kroll, with the Texas Hospital Association, says that “Up until the switch, we were reporting about 70 elements and we’re now at 129… clearly we’re in the middle of a pandemic… this isn’t the type of stuff you try to do in the middle of a pandemic.” Hospitals have been reporting to the CDC under standard practice for over 15 years, which means these changes are a painfully challenging process to push onto hospitals while the Covid-19 pandemic continues to plague the United States.

Conclusion 

The transfer of Covid-19 data reporting responsibilities from the CDC to TeleTracking Technologies is ultimately an irresponsible move on the part of the Trump Administration, which has put physicians and patients at a disadvantage by pursuing a path that limits the transparency of data to the general public. However, it is our new reality, and if we cannot rely on our government to provide reliable data, then we can hope that efforts from private companies, such as IBM, can provide researchers and physicians with trustworthy data.