Studying Our Genes to Understand the Impacts of Covid-19
Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction

Here at Altheia Predictive Health, we analyze several factors when making disease predictions; among these are family history and DNA markers, both components of assessing genetic predisposition. Some companies dive deeper into DNA markers as a primary indicator for specific conditions, such as how the BRCA1 gene relates to breast cancer in women. Similarly, genetics is now being used to understand whether certain people are at higher risk of suffering serious complications from Covid-19 because of particular genetic expressions. Studies are also underway to determine if and how Covid-19 may change or mutate gene expression. In this article, we will take a look at some of the research being performed in this area.

Discussion

When it comes to gene expression as it relates to Covid-19, researchers are not simply looking at who is most likely to test positive for the disease, because many people are asymptomatic carriers. Rather, they are looking at gene expression in patients who have suffered severe symptoms, or even died, as a result of Covid-19. A recent study published in the scientific journal Nature Research discussed a genetic association study that "identified a gene cluster on chromosome 3 as a risk locus for respiratory failure after infection with severe acute respiratory syndrome coronavirus 2." The study went one step further and traced this genetic expression to find that it was actually inherited from Neanderthals and "is carried by around 50% of people in south Asia and around 16% of people in Europe."

Another study, out of the University of Edinburgh, went a step further to gauge the strength of this marker's impact and found that "because 74% of patients [with the marker] were so sick that they needed invasive ventilation, it had the statistical strength to reveal other markers, elsewhere in the genome, linked to severe COVID-19; and that a single copy of the associated variant more than doubles an infected person's odds of developing severe COVID-19."
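To make the "more than doubles the odds" figure concrete, here is a minimal Python sketch of how an odds ratio is computed from a 2x2 table of variant carriers versus severe outcomes. The counts are entirely hypothetical, not figures from the Edinburgh study:

```python
# Illustrative only: how an odds ratio like the one above is derived
# from a 2x2 table of variant carriers vs. severe outcomes.
# The counts below are hypothetical, not data from the Edinburgh study.

carrier_severe, carrier_mild = 120, 80        # patients carrying the variant
noncarrier_severe, noncarrier_mild = 90, 140  # patients without it

odds_carrier = carrier_severe / carrier_mild
odds_noncarrier = noncarrier_severe / noncarrier_mild
odds_ratio = odds_carrier / odds_noncarrier

print(f"Odds ratio: {odds_ratio:.2f}")  # a value above 2 means more than doubled odds
```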

For those who survive a tough battle against Covid-19, as well as for those who test positive but are asymptomatic, the question remains whether the disease can cause long-term damage to our genes. Google recently granted researchers at the University of North Carolina at Chapel Hill $500,000 to study if and how Covid-19 alters gene expression. To conduct this research, researchers will "compare RNA, a marker of gene expression, from the blood collected over years from the same individuals before and after COVID-19 infection [and]… use artificial intelligence tools to scan the genome for changes in gene expression that may be due to COVID-19 infection."

Conclusion 

Similar to many other health issues, our genetics can clearly play a huge role in how successfully we battle a disease such as Covid-19. What is promising about the research in this area is that once we better understand who is at the highest risk, we can better protect those people and even create gene therapies to prevent severe symptoms or death from Covid-19. However, not being at risk of severe symptoms is no reason to put away your mask: Covid-19 has already been shown to have long-term effects on those who test positive while asymptomatic, and further studies will help us understand how severely the disease alters our genetic expression.


How Analytics and Technology Can Enable and Improve Patient Engagement

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction

When it comes to our medical data, many people simply go to their annual checkups and hear feedback from their doctors. This is often a very one-sided experience, and we know that for people to be healthy, they must take an active role in their health. Engaging in our health can mean different things for different people: for those who are already very healthy, it could mean tracking nutrition, dietary decisions, or exercise routines; for those suffering from chronic conditions, such as diabetes, it can mean using apps that track health metrics to improve the understanding of one's own body. In this article we will take a look at a few companies seeking to improve patient engagement in order to improve the health of their users.

Body 

In this day and age, nearly everyone has a smartphone, and from that phone they conduct almost all of their communication, check the weather, secure their homes, do their banking, and more. It is only natural, then, that they could also use this device to enhance their health. Apps are a great way to increase patient engagement because of their accessibility, and many providers have realized this. MyChart is widely used by providers to communicate test results and ranges, appointment summaries, and other relevant health information. This is an important tool because it lets patients keep their medical records on hand, whether as reminders for themselves or as supplemental information for nutritionists or trainers. MyChart also gives patients a direct line of communication with their doctors for any pressing or important questions. Another great app available to consumers is mySugr; this app was created to help diabetics track, understand, and control their blood sugar levels. The app lets users log their blood glucose and insulin levels, their medication list and dosages, and their meals, from which it derives an estimated carb intake. All of these factors are key to keeping diabetic patients in good health, and mySugr uses these inputs to create detailed reports and health analyses for patients, as well as takeaways to share with a physician. A great app for those suffering from cardiovascular issues is Kardia, which integrates with health devices such as EKG and blood pressure monitors to analyze and log EKG results. With continued use, the app learns your body's normal readings and notes when abnormalities show up, as well as when those abnormalities seem serious enough to contact a physician. The app also creates concise reports to share with your providers.
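As a rough illustration of the "learn your normal, flag the abnormal" pattern that apps like Kardia are described as using, here is a minimal Python sketch. It is not any vendor's actual algorithm, and the readings are invented:

```python
import statistics

# Hypothetical sketch of baseline learning plus anomaly flagging.
# The resting heart-rate readings below are invented example data.

history = [72, 75, 71, 74, 73, 76, 72, 70, 74, 75]  # baseline readings, bpm

def is_abnormal(reading, history, z_threshold=3.0):
    """Flag a reading that falls far outside the user's learned baseline."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(reading - mean) > z_threshold * stdev

print(is_abnormal(73, history))   # False: within the normal range
print(is_abnormal(110, history))  # True: worth surfacing to a physician
```

A production system would draw on far richer signals, such as full EKG waveforms and activity context, but the core idea of comparing new readings against a learned personal baseline is the same.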

Conclusion

As technology continues to improve the world around us, it is amazing that it can also improve the functions that happen within us. Analytics and apps make better health easy to access for many people, often at no additional cost, which means everyone should consider incorporating such apps into their lives.


Analyzing Air Pollution and its Effects on Our Health

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction 

As wildfires in California continue their path of destruction through the Golden State, Americans across the country are feeling the effects as smoke travels up to 2,500 miles. The health dangers of smoke pollution are well documented; however, it is only recently that the power of data analytics has given us deeper insight into the issue.

Body 

China is a country with a huge carbon footprint, and the impact of its economic decisions has created thick smog in its most populated cities, leading citizens to wear protective masks for years. A study out of Shenzhen has created a stepping stone for further research in this area by proposing a model that looks at:

(i) estimating high-resolution concentrations of air pollution with big data analytics based on enormous structured and unstructured data;

(ii) quantifying the health effects of both single pollutants and pollutant mixtures; and

(iii) designing a personalized health advisory model based on individual characteristics and exposure information.

Another study, out of the Amity Institute of Information Technology, proposed a unique model workflow pairing environmental and medical data (the original workflow figure is not reproduced here).
The takeaway from these models is that data extraction for this type of research is twofold: the models draw on both spatiotemporal and medical data inputs to establish relationships between the two, after which machine learning can be applied to understand how certain environmental events impact both the environment and human health.
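As a hedged sketch of that twofold approach, the Python below joins invented spatiotemporal pollution readings to invented health outcomes by region and date, then fits a model to relate the two. The column names and values are placeholders, not data from either study:

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Minimal sketch: join spatiotemporal pollution readings to health
# outcomes by region and date, then let a model learn the relationship.
# All values are invented for illustration.

pollution = pd.DataFrame({
    "region": ["A", "A", "B", "B"],
    "date": ["2020-08-01", "2020-08-02", "2020-08-01", "2020-08-02"],
    "pm25": [95.0, 130.0, 22.0, 35.0],
    "humidity": [0.30, 0.25, 0.55, 0.50],
})
admissions = pd.DataFrame({
    "region": ["A", "A", "B", "B"],
    "date": ["2020-08-01", "2020-08-02", "2020-08-01", "2020-08-02"],
    "respiratory_admissions": [41, 58, 9, 12],
})

merged = pollution.merge(admissions, on=["region", "date"])
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(merged[["pm25", "humidity"]], merged["respiratory_admissions"])

# Feature importances hint at which exposures drive outcomes most.
print(dict(zip(["pm25", "humidity"], model.feature_importances_.round(2))))
```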

The biggest roadblock in building these types of models is that no two environmental events are ever the same, for many different reasons. Wildfires in Arizona are very different from wildfires in California, and while we can hope that machine learning will differentiate and adapt to them for us, the variables driving those differences may be left out of the model. For example, humidity can exacerbate natural disasters, as can the terrain of the affected area. The number of houses and cars, the types of materials present, and many other factors can influence these models, so it is entirely possible that researchers cannot account for them all.

Conclusion

The most impactful conclusion we can draw from the California wildfires is that their detrimental effects on our health and planet would have been drastically lower had precautions been taken against climate change. Climate change has created an environment that helps these wildfires thrive and makes it significantly more difficult to quell their flames. Precautions we can take for our own health include investing in air purifiers, staying indoors when possible, and wearing protective masks when outside. Precautions we can take to prevent further damage to our planet include investing in alternative energy sources and lowering our carbon footprint through more sustainable decisions such as shopping locally, recycling, and carpooling; more information on lowering your carbon footprint can be found here.


Using Big Data to Improve the Lives of Liver Disease Patients

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health


Introduction 

The liver is an extremely important organ in our bodies and is responsible for several tasks; of its many functions, the most important are filtration and detoxification. The liver also helps regulate blood sugar, break down fat, and produce and recycle blood at certain points in our lives, among many other supporting functions. Clearly, the liver performs key actions to keep our bodies healthy and functioning, so anything that decreases its capabilities, such as liver disease, is important to examine. Liver disease is a very broad term and refers to any issue that affects the liver's ability to function. Liver issues are classified as a disease when 75% or more of the liver tissue is affected, which is when the decrease in function begins. It can be caused by one or many factors: infections such as hepatitis, autoimmune diseases, certain cancers, genetics, alcohol abuse, and increased fat accumulation can all play a role in the onset of liver disease. The use of data to analyze these factors, as well as metabolic factors, is key to improving the diagnosis and management of liver disease.

Discussion 

There are a few quantifiable factors related to liver disease: age, gender, total bilirubin, direct bilirubin, alkaline phosphatase, alanine aminotransferase, aspartate aminotransferase, total proteins, albumin, and the albumin-to-globulin ratio. All of these variables have been positively correlated with the presence of liver disease and would be important inputs for any algorithm that looks at liver disease or other diseases that compromise liver function. The Global Journal of Computer Science and Technology published a report in 2010 that looked at how to apply statistical modeling and machine learning to the study of liver disease. The authors used three different supervised algorithms, Naive Bayes, KStar, and FT Trees, to predict liver disease diagnoses and found that FT Trees provided the highest accuracy at 97.10%. Such a high accuracy rate provides a solid foundation for other researchers to add new variables and factors to improve that rate further. Also in this area of research is a project led by Harvard Medical School, Massachusetts General Hospital, and the Georgia Institute of Technology that set out to better understand the effects of alcohol on liver-related deaths. The project began by modeling drinking patterns against alcohol-based liver issues in patients born from 1900-2012 and then simulated different intervention scenarios to see whether reducing alcohol consumption also lowered the chance of a liver-related death. The findings, in line with most practical medical advice, were that liver function decreased with alcohol consumption and improved more the earlier an intervention occurred.
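As an illustration, here is a minimal Python sketch of a classifier over the factors listed above. The 2010 study used Naive Bayes, KStar, and FT Trees (Weka implementations); scikit-learn has no FT Trees, so a plain decision tree stands in, and the patient records are synthetic placeholders rather than real clinical data:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Sketch of a liver-disease classifier over the listed factors.
# A decision tree stands in for the study's FT Trees, and the data is
# synthetic noise purely so the example runs end to end.

FEATURES = [
    "age", "gender", "total_bilirubin", "direct_bilirubin",
    "alkaline_phosphatase", "alanine_aminotransferase",
    "aspartate_aminotransferase", "total_proteins", "albumin",
    "albumin_globulin_ratio",
]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, len(FEATURES)))  # 200 synthetic "patients"
y = (X[:, 2] + X[:, 5] > 0).astype(int)    # toy label tied to two features

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)
print(f"Mean 10-fold accuracy: {scores.mean():.3f}")
```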

Prevention

The most important things you can do to keep your liver healthy are to eat healthy foods and live a healthy lifestyle, including reducing alcohol consumption. There are several genetic factors that, unfortunately, cannot be mitigated; however, having genetic tests done so that you are aware of any increased risk is critical and can help you determine how much you may need to adjust your lifestyle to accommodate those factors.

Conclusion 

Similar to many other diseases, liver disease is quantifiable in many ways. Several promising studies have already created a solid foundation for further research applying machine learning to this field of study, and others show the importance and influence of lifestyle interventions in changing outcomes for patients suffering from liver disease.


Putting Data Behind Parkinson’s Disease

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction 

Over one million people living in the United States suffer from Parkinson’s disease, more than the number of people suffering from multiple sclerosis, muscular dystrophy, and amyotrophic lateral sclerosis combined. Parkinson’s disease is a progressive disease of the nervous system that leads to stiffness and slowing of movement due to the breakdown and loss of nerve cells in the brain. Symptoms in the early stages can include changes in facial expression, slurred or soft speech, and other minor changes in a person’s ability to move normally and easily. As the disease worsens, one can develop a tremor, rigid muscles, impaired balance, and a loss of automatic movements. The cause of Parkinson’s disease is unknown, but early research suggests certain gene variations can increase one’s risk of developing Parkinson’s, and that certain toxins may trigger its onset. Treatment for Parkinson’s disease is also fairly costly, and its side effects, such as decreased cognitive abilities, can further reduce a patient’s ability to shoulder the associated costs. Clearly, there is huge importance in furthering the study of Parkinson’s disease, and one way to improve such studies is to use analytics tools in disease analysis.

Discussion

There are several ways in which analytics can benefit those suffering from Parkinson’s disease; because the symptoms and effects of Parkinson’s are mainly physical, many of the metrics used in caring for patients revolve around movement. For example, activity trackers used for Parkinson’s patients include algorithms that detect abnormalities in walking patterns such as “tremor, dyskinesia, asymmetry, festination, and freezing.” These algorithms can also measure activity levels and use this information to understand how a patient’s walking patterns and habits are changing. Tremors are a common symptom of Parkinson’s and can be observed and measured using spectral analysis; measuring Parkinson’s tremors can be helpful because such “episodes are correlated to medication intake events,” and doctors can adjust medication as necessary based on the observed data. Furthermore, as technology continues to evolve rapidly, patients may eventually be able to understand and adjust their medication on their own. Finally, because many people with Parkinson’s experience sleep disturbances such as “insomnia, periodic limb movement disorder and REM-sleep disorder,” combining a generic sleep study with gait pattern studies and applying data science tools can provide a more accurate analysis of sleeping habits for Parkinson’s patients.
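As a simple illustration of the spectral-analysis idea, the Python sketch below estimates the dominant frequency of a synthetic wrist-accelerometer signal and checks whether it falls in the roughly 4-6 Hz band typically reported for Parkinsonian rest tremor. This is a toy example, not a clinical algorithm:

```python
import numpy as np

# Toy spectral analysis: find the dominant frequency in an accelerometer
# trace and check it against the 4-6 Hz rest-tremor band. The signal is
# synthetic: a 5 Hz sine wave plus noise standing in for real sensor data.

fs = 50  # sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC component

print(f"Dominant frequency: {peak:.1f} Hz")
print("Tremor-band activity" if 4 <= peak <= 6 else "Outside tremor band")
```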

Another promising example of technology and analytics supporting the lives of people with Parkinson’s disease is the use of wearable technology to evaluate symptoms. Intel Corporation, in partnership with the Michael J. Fox Foundation, proposed a program in 2014 to develop a wearable tracking watch that could conveniently collect and record patient information. From these records, machine learning techniques could be applied to understand and assess the progression of a patient’s symptoms and help providers adjust care management methods or medication dosages.

Conclusion 

Parkinson’s disease is a condition that affects many Americans and many more across the globe. While no treatment to cure Parkinson’s exists, there are care management options available and applying data science tools to these options can significantly improve a patient’s quality of life. Additionally, the creation of tools such as smart watches can further improve the quality and quantity of data available to perform these studies.


How Analytics Can Impact and Improve the Health of Small Communities (+ Case Study)

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction 

Community health information can be very revealing; there is a lot to learn from the data of a specified community whether that specification be location, race, gender, occupation, age, or income bracket. However, these are relatively large communities and while that data is incredibly important, small community analytics can be even more targeted and actionable due to the ability to better communicate information to smaller groups. 

Discussion 

The biggest struggle in studying small communities is validity: many statisticians have argued that small community studies do not meet the sample-size benchmarks needed to be referenced in other studies, and this is a valid concern. However, if smaller studies are not approached as ones to be broadly distributed, they can be especially impactful for their own communities. Below, we’ve attached a case study from Cerner to demonstrate the importance of small community studies:


Cerner Case Study


Every year, approximately 735,000 Americans have a heart attack. There is great interest in improving this number, and one of the ways we can contribute to that goal is by quickly identifying symptoms of and treating heart attacks. Troponin tests are commonly used in the emergency department (ED) to identify whether a patient is experiencing a heart attack. In an ideal setting, the turnaround for a troponin test is about 35 minutes; most hospitals have a protocol of 60 minutes or less.

We recognized an opportunity for improvement with some of our clients around their troponin test rates. We pulled data on individual clients and compared it to industry-wide data, and found that while some of our clients had fantastic numbers, others hadn’t had a focus group around this topic and had room for improvement. If a hospital’s median turnaround time for a troponin test is 45 minutes, for example, that still means that approximately half of its tests are taking longer than that.
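To illustrate the arithmetic behind that point, here is a minimal Python sketch computing the median turnaround and the share of tests exceeding a 60-minute protocol; the times are hypothetical, not Cerner client data:

```python
import statistics

# Hypothetical turnaround times in minutes, from specimen draw to result.
turnaround_minutes = [28, 35, 41, 45, 47, 52, 58, 63, 71, 88]

median = statistics.median(turnaround_minutes)
over_protocol = sum(t > 60 for t in turnaround_minutes) / len(turnaround_minutes)

print(f"Median turnaround: {median} min")  # by definition, half take longer
print(f"Share exceeding 60-min protocol: {over_protocol:.0%}")
```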

Though there is currently no troponin test standard mandated by the Centers for Medicare & Medicaid Services (CMS), the turnaround time clearly impacts patient care. Think of it this way: a 25-minute difference in test results is akin to an ambulance arriving to pick up an individual with heart attack symptoms and then simply waiting in the driveway for nearly half an hour.

Conclusion 

The study of the health of small communities can seem a little slow, but it actually has the potential to be extremely interesting! Small community studies can identify several localized issues, ranging from problems in infrastructure to localized malnutrition to small disease networks. Here at Altheia Predictive Health, we are specifically trying to make the research and benefits of big data accessible to small communities through our app. If you know a small company or organization that could benefit from our research, please send them our way!


The Most Personalized and Precise Form of Healthcare: a Discussion of the Genome

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction 

Our genomes are essentially a personalized index or library of everything we are: the combination of genes and DNA that holds all of our genetic information. The field of genomics is relatively new, with most studies citing its roots in the 1980s. In fact, the Human Genome Project only began in 1990 and was declared complete in 2003. Though new, the field of genomics, like many other fields touched by technology, has evolved rapidly. For context, processing a human genome would have cost $20-25 million in 2006, compared to well below $1,000 today, and its market has grown from $1 billion to $4.5 billion in the last 8 years alone. Furthermore, the first human genome sequence took 3 years of processing power, while today a human genome can be processed in less than 3 days. This increased accessibility of genomic information is an incredibly important development for preventative care and can be a life-saving step for many people.

Discussion

The Precision Medicine Initiative “is an emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle for each person [whose] approach will allow doctors and researchers to predict more accurately which treatment and prevention strategies for a particular disease will work in which groups of people.” This is where genomics finds its highest level of applicability and importance. Genomics, combined with the power we now have in machine learning, can provide incredible insights into which genes are relevant to certain diseases. With that insight, patients can adjust their lifestyle to their risk factors and have a much better understanding of where they stand in terms of their health. Doctors also have a much clearer idea of which tests may need to be run much earlier in a patient’s life and which tests may never need to be run unless an event occurs prompting questions outside the norm. Overall, genomics can save a lot of time and money for providers and patients alike.

One of the struggles with sequencing an individual’s entire genomic profile is the sheer processing and storage power needed to execute algorithms on an entire sequence; massive database space is necessary to perform these types of analytics. However, one approach that can make genomics even more accessible is isolated sequencing. For example, if someone already knows their family has a high risk of a certain disease, they may choose to sequence only parts of their DNA, such as the BRCA1 and BRCA2 genes for those with a higher risk of breast cancer. This methodology can be applied to any genetically inherited disease. However, the ultimate hope for many is that genome sequencing becomes accessible enough that anyone can sequence their entire genome. Healthcare providers could then use that information to deliver a much more personalized approach to a patient’s diagnosis and care plans.
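As a hedged sketch of the isolated-sequencing idea, the Python below filters a variant list down to the BRCA1 and BRCA2 regions rather than analyzing a whole genome. The coordinates are approximate GRCh38 positions, and the variants are invented placeholders:

```python
# Sketch of isolated sequencing: restrict attention to regions of interest
# instead of the whole genome. Coordinates are approximate GRCh38 positions
# for BRCA1 and BRCA2; the variant records below are invented examples.

TARGET_REGIONS = {
    "BRCA1": ("chr17", 43_044_295, 43_125_483),
    "BRCA2": ("chr13", 32_315_474, 32_400_266),
}

def in_target(chrom, pos):
    """Return True if a position falls inside any target gene region."""
    return any(
        chrom == c and start <= pos <= end
        for c, start, end in TARGET_REGIONS.values()
    )

variants = [("chr17", 43_100_000, "A>G"), ("chr2", 1_500_000, "C>T")]
targeted = [v for v in variants if in_target(v[0], v[1])]
print(targeted)  # only variants inside BRCA1/BRCA2 survive the filter
```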

Conclusion 

In comparison to many other fields of study, genomics is very new; however, that hasn’t stopped it from catching up with (and even outrunning and outshining) many other fields in terms of accessibility. When we look at communities, whether defined by location, ethnicity, age, or gender, we get a much clearer picture of how the health of a population is influenced. As the technology used to support genomics becomes more accessible to patients, we can expect that picture to get even clearer and healthcare to become even more personalized.


Improving the ROI of EHRs Through Analytics

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health

Introduction 

As with any business feature, business owners and analysts must consider whether the cost associated with a feature is worth the return on investment. Electronic Health Records (EHRs), though an integral part of the healthcare system, are a good example of a staple that does not always justify its cost. EHRs are, of course, extremely beneficial within healthcare; however, their high implementation and maintenance costs (think billions of dollars) mean that they are not necessarily worth the investment unless steps are taken to optimize their use. Because EHRs are ingrained in our healthcare system, the question isn’t whether they are worth the investment but how we can make them worth the investment.

Discussion 

The first and most documented issue with EHRs is their accessibility and readability, which limits their usability as well. Additionally, “EHR reports tend to run on a predetermined schedule, limiting how the data within the EHR can be used to evaluate key performance indicators, populations studies, or long-term trends,” which further limits their ability to be improved upon. Many investors and market researchers say that the next step in EHR improvement is to invest heavily in programs and software able to translate data from EHRs to other systems so that it can be used across different contexts. This development will allow the power of analytics to significantly improve the return on investment for EHRs by providing insight and direction in terms of “bed management, case management, ED, workforce management, scheduling, and OR management systems [such that] staff can see the upstream and downstream effects of a single operational decision.” This is important because the time it takes to “translate” EHR data means that time has passed since the data was collected and, in healthcare, real-time insights and decisions can be critical. Once the issues with readability and context are solved, EHRs can support predictive analytics by providing on-demand trend analysis and suggested steps to be verified by physicians. Such insight can cut costs for hospitals by tracking patient flow, for providers by creating demographic reports, and for patients by reducing the number of tests needed for diagnosis.
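As an illustration of what on-demand trend analysis over translated EHR data might look like, here is a minimal Python sketch; the events, unit names, and columns are invented placeholders, not any vendor’s schema:

```python
import pandas as pd

# Sketch of on-demand trend analysis: instead of a fixed report schedule,
# pull EHR events into an analytics layer and compute a rolling indicator
# whenever it is needed. The events below are invented examples.

events = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2021-03-01 08:10", "2021-03-01 14:30", "2021-03-02 09:05",
        "2021-03-02 21:40", "2021-03-03 07:55", "2021-03-04 16:20",
    ]),
    "unit": ["ED"] * 6,
    "event": ["admit", "discharge", "admit", "admit", "discharge", "admit"],
})

# Daily net ED occupancy change, computed on demand rather than on a schedule.
daily = (
    events[events["unit"] == "ED"]
    .set_index("timestamp")["event"]
    .resample("D")
    .apply(lambda e: (e == "admit").sum() - (e == "discharge").sum())
)
print(daily.rolling(2).mean())  # a short rolling mean smooths daily noise
```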

Conclusion 

The newest development in this space is a Google study’s use of deidentified EHRs to make patient health predictions. Though this project is still in the proof of concept phase, their prediction models have outperformed standard hospital models in every test thus far. This is a promising development in the optimization of EHR use that could encourage further research from smaller companies and at the university level, as well as inspire further investments towards the effort of getting the most out of Electronic Health Records. 


Concerns Regarding the Trump Administration’s Contract with TeleTracking Technologies

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health


Introduction

A few weeks ago, when the Trump Administration took Covid-19 reporting responsibilities away from the CDC, there were several questions about how data would be processed and whether the public could trust the accuracy of new data. Not long after that development, the Trump Administration awarded a company called TeleTracking Technologies a multi-million dollar contract to collect and report on Covid-19 data. However, inconsistencies in reporting and a lack of transparency in collection methods have raised a lot of questions, both about teletracking as a process and about TeleTracking Technologies as a company.

Discussion

One of the biggest causes for concern with TeleTracking Technologies is that it has refused to answer questions about coronavirus data from United States senators, citing a nondisclosure agreement with the Trump Administration. This is deeply concerning because it limits the oversight power of the branches of government outside the executive. Lawyers for the company refuse to disclose how TeleTracking Technologies collects and shares its information. In our last article, we heard from physicians and hospital administrators who are extremely concerned that the process by which Covid-19 data is collected will be skewed toward supporting the Trump Administration’s political goals; given President Trump’s close ties to the founder and CEO of TeleTracking Technologies, this notion does not seem outside the realm of possibility. The move has been highly criticized by researchers and academics, who cannot accurately conduct their own research without transparent data collection and reporting practices.

Finally, a huge concern is that these policy and process changes are coming abruptly and at an awful time. Carrie Kroll, with the Texas Hospital Association, says that “Up until the switch, we were reporting about 70 elements and we’re now at 129… clearly we’re in the middle of a pandemic… this isn’t the type of stuff you try to do in the middle of a pandemic.” Hospitals have been reporting to the CDC as standard practice for over 15 years, which means these changes are a painfully challenging process to push onto hospitals while the Covid-19 pandemic continues to plague the United States.

Conclusion 

The transfer of Covid-19 data reporting responsibilities from the CDC to TeleTracking Technologies is ultimately an irresponsible move on the part of the Trump Administration, which has put physicians and patients at a disadvantage by pursuing a path that limits the transparency of data to the general public. However, it is our new reality, and if we cannot rely on our government to provide reliable data, we can hope that efforts from private companies, such as IBM, will provide researchers and physicians with trustworthy data.


Can Technology Boost Efficiency in Healthcare? Plus, a Look at the Companies Leading the Way

Authored by Ayesha Rajan, Research Analyst at Altheia Predictive Health


Introduction

It is currently estimated that anywhere between 20% and 50% of U.S. healthcare system costs are due to inefficiency. The troubling part of this statistic is that the money could be going to several better places, such as investments in healthcare startups or preventative care plans. These excess costs directly affect consumers who, in the field of healthcare, are also patients. However, we now live in a time of increased technological capability, and technology, paired with the power of analytics, can help us decrease redundant care, improve transitions of care, and advance provider-to-provider and provider-to-patient communication in order to decrease cost waste in the healthcare industry.

Discussion

One of the biggest cost concerns in the healthcare field is communication: paper, phone calls, and faxes are all big contributors to inaccuracies and miscommunication. Albert Santalo, the founder of CareCloud, believes technology can bridge a huge gap here. CareCloud is a cloud-based electronic health record provider that hopes to cut the costs of inefficiency by creating a platform that allows cross-communication between providers, billing, and consumers. Another big concern in cutting healthcare costs is the timely entry of data. Hill-Rom Holdings is another company making great strides in healthcare with its smart hospital beds. These beds ensure that vital signs are entered and time-stamped immediately, rather than hours later when providers get a chance to enter data into their system. This technology saves money by giving physicians the ability to make accurate and timely decisions about a patient’s care plan. Eventually, analytics can support the goal of accurate decisions even further by applying machine learning techniques to the smart bed.

Another use for analytics in healthcare is to improve the timeliness of patient transfers out of the ICU. The current system for transferring ICU patients is reactionary and subject to error, but applying analytics to this process can not only reduce costs but also prevent deaths: opening up ICU beds makes space for those who need them most, and some patients may receive better, more specialized care in another unit.
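As a hedged sketch of how such a transfer-readiness score might be built, the Python below trains a simple classifier on synthetic stand-in data; the features and the labeling rule are illustrative assumptions, not a validated clinical model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Sketch: score ICU patients on readiness to step down, rather than relying
# on reactive judgment alone. Features and labels are synthetic stand-ins.

rng = np.random.default_rng(0)
n = 300
hours_in_icu = rng.uniform(1, 120, n)
vasopressor_dose = rng.exponential(1.0, n)
spo2 = rng.normal(95, 3, n)

X = np.column_stack([hours_in_icu, vasopressor_dose, spo2])
# Toy rule: stable oxygenation plus low vasopressor need => ready to transfer.
y = ((spo2 > 94) & (vasopressor_dose < 1.0)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")

# In practice such a score would feed a clinician-facing dashboard; the
# model only surfaces candidates for review, and clinicians decide.
```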

Conclusion

When it can be estimated that up to half of an industry’s costs are due to waste and inefficiency, it is clear that something should change; when that industry is healthcare, it is clear that something needs to change. There should be no room in healthcare for waste or inefficiency, because this is an industry that deals with people’s livelihoods and well-being. Thankfully, the rise of analytics and technology is helping create cost solutions that prevent waste and inefficiency and can improve the lives of patients across the nation.