HIV Persistence and Evolution
Human immunodeficiency virus (HIV) infection is a serious international public health problem that affects over 30 million people worldwide and is associated with high rates of morbidity and mortality. Combination antiretroviral therapy (cART) can durably suppress measurable virus in the blood of a majority of HIV-infected patients. As a result, HIV infection is now a chronic illness for patients with continued treatment access and excellent long-term adherence. Nevertheless, clinically significant resistance resulting in treatment failure remains common among patients on cART. This is a serious problem, as the number of treatment options is limited; a patient whose virus becomes resistant to all available antiviral drugs is very likely to die early, either of HIV-related disease or of antiviral toxicity. A large Australian study showed that exhausting the available options reduces lifespan by an average of 6 years for patients diagnosed in their forties, and by 15 years for patients diagnosed in their twenties (J. Jansson et al., AIDS, Dec. 2012). Consequently, any method that reduces the incidence of treatment failure will have a significant impact on the life expectancy of HIV-positive patients.
Our research has focused on reducing the risk of treatment failure using information about the nonlinear dynamics of HIV persistence and evolution. Our research in this area has been funded by an R21 grant from the NIH NIAID. We first considered how to reduce the risk of failure associated with the high viral load present when a patient switches from a failing therapy. Antiviral therapy is theoretically capable of reducing the HIV turnover rate by several orders of magnitude, so drug-resistant virus that emerges within the first few years of treatment is statistically more likely to have pre-existed the start of treatment than to have developed during it. In a series of conference and journal publications, we showed that by reducing the viral load before introducing a set of antiviral drugs the virus has not previously encountered, the likelihood of treatment failure due to this mechanism can be reduced by orders of magnitude. We also showed that this can be accomplished either through treatment interruptions or through permuted schedules of previously failed treatments (Luo et al. PLoS One 2011, Luo et al. J. Process Control 2011).
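The intuition behind viral load reduction before a switch can be illustrated with a toy calculation. If resistant mutants arise independently at a small per-cell rate, the chance that at least one pre-exists scales roughly as 1 - exp(-mu*N) in the infected-cell population N. The function and parameter values below are illustrative assumptions, not the published model:

```python
import math

def p_preexisting_mutant(viral_load, mutation_rate=3e-5, cells_per_copy=1.0):
    """Illustrative probability that at least one resistant mutant
    pre-exists the switch, modeled as 1 - exp(-mu * N) where N is the
    infected-cell population. All parameters are hypothetical."""
    n_infected = viral_load * cells_per_copy
    return 1.0 - math.exp(-mutation_rate * n_infected)

# Lowering viral load before introducing the new drugs shrinks the
# risk of a pre-existing resistant mutant by orders of magnitude.
high = p_preexisting_mutant(1e5)  # switching directly from failing therapy
low = p_preexisting_mutant(1e2)   # after first driving the load down
```

With these toy numbers, `high` is near certainty while `low` is well under one percent, which is the qualitative effect the intervention exploits.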
In order to test the sensitivity of our methods to inter-patient variations in HIV dynamics, we needed to develop a reliable distribution of the HIV dynamic parameters. To accomplish this, we fit our dynamic model to patient data collected by our collaborators at the IRSI-Caixa AIDS research center in Barcelona, Spain. These data came from twelve patients in a treatment interruption experiment and consisted of quantitative PCR viral load measurements taken twice weekly throughout the course of the experiment. The quality of the data was significantly higher than that used in previous HIV parameter identification studies (we had an average of 98 viral load measurements per patient, compared to 9 in the best previous study). This allowed us to generate a reliable posterior distribution for the unknown parameter values for each patient, which we used as the prior distributions for sensitivity analysis of our proposed treatment methods. The high quality of the data also allowed us to publish the first direct estimates of antiviral drug efficacy in vivo, and to estimate the contribution of the latent reservoirs during treatment (Luo et al. PLoS One 2012).
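The kind of posterior inference described above can be sketched in miniature. The example below fits a single decay-rate parameter of an exponential viral decline to synthetic twice-weekly measurements with a Metropolis sampler; the one-parameter model, noise level, and proposal settings are simplifications for illustration, not the actual multi-parameter model or data:

```python
import math
import random

random.seed(0)

def log_likelihood(delta, times, loads, v0=1e5, sigma=0.3):
    """Log-normal measurement error around V(t) = v0 * exp(-delta * t)."""
    ll = 0.0
    for t, v in zip(times, loads):
        pred = math.log(v0) - delta * t
        ll += -((math.log(v) - pred) ** 2) / (2 * sigma ** 2)
    return ll

def metropolis(times, loads, n_iter=2000, step=0.05):
    """Random-walk Metropolis sampler over the decay rate delta."""
    delta = 0.3
    ll = log_likelihood(delta, times, loads)
    samples = []
    for _ in range(n_iter):
        prop = delta + random.gauss(0, step)
        if prop > 0:
            ll_prop = log_likelihood(prop, times, loads)
            if math.log(random.random()) < ll_prop - ll:
                delta, ll = prop, ll_prop
        samples.append(delta)
    return samples

# Synthetic twice-weekly data generated with true delta = 0.5 per day.
times = [3.5 * i for i in range(8)]
loads = [1e5 * math.exp(-0.5 * t) for t in times]
post = metropolis(times, loads)
mean_delta = sum(post[500:]) / len(post[500:])  # posterior mean after burn-in
```

The posterior mean recovers the true decay rate; with dense sampling like the twice-weekly data, such posteriors become tight enough to serve as informative priors.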
To formulate a likelihood function for the model fitting, we developed a novel theoretical model for the uncertainty present in quantitative PCR measurements of HIV viral load. We fit this model to experimental data, and showed that there was a well-defined, density-dependent log-normal measurement uncertainty that grew significantly for low copy numbers. When we applied this model to a recently published variant assay with single-copy sensitivity, we were able to show that the uncertainty of the new measurement spanned its dynamic range. This showed that the new assay was being frequently misused as a quantitative assay in the literature (Luo et al. J. Clinical Microbiology 2012).
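The qualitative shape of a density-dependent log-normal error model can be sketched as follows. The functional form here (a constant noise floor plus a Poisson counting term that dominates at low copy numbers) and the parameter values are illustrative assumptions, not the fitted model from the paper:

```python
import math

def qpcr_log10_sd(copies, sd_floor=0.1, poisson_scale=1.0):
    """Illustrative density-dependent uncertainty for a qPCR viral load
    measurement, in log10 units: a constant assay floor combined with
    Poisson counting noise that grows as copy number falls.
    Functional form and parameters are hypothetical."""
    counting_sd = poisson_scale / (math.log(10) * math.sqrt(copies))
    return math.sqrt(sd_floor ** 2 + counting_sd ** 2)
```

Near single-copy sensitivity the counting term dominates and the log10 uncertainty is several-fold larger than at high copy numbers, which is why an assay can be reliable in bulk quantification yet effectively qualitative near its detection limit.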
We found that the methods for reducing the risk of resistance developing following a switch of therapy described in (Luo et al. PLoS One 2011) were very sensitive to the inter-patient parameter variation identified from patient data in (Luo et al. PLoS One 2012). Specifically, the time at which the viral load minimum was reached following our intervention could vary by months across patients, and the rebound following the minimum could result in an order-of-magnitude loss of effectiveness in only days. A closed-loop strategy was clearly necessary to accurately detect the induced viral load minimum following our treatment intervention. A naive strategy would be to sample viral load every three days following our intervention until a local minimum was detected, but this would result in an unacceptably high number of viral load measurements, which are costly and invasive. Instead, we developed a closed-loop, recursive minimum prediction method which is able to accurately detect the viral load minimum using a fraction of the viral load measurements required by the naive method (Cardozo et al. IEEE Trans Biomedical Engineering 2012).
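One simple way to predict a minimum from sparse samples, shown here as a stand-in for the published recursive method, is to fit a quadratic through the most recent three measurements and schedule the next sample near its vertex rather than on a fixed three-day grid:

```python
def predict_minimum_time(times, loads):
    """Fit a quadratic through the last three (time, load) points and
    return its vertex: the predicted time of the minimum. A simplified
    illustration, not the published recursive predictor."""
    (t0, y0), (t1, y1), (t2, y2) = zip(times[-3:], loads[-3:])
    denom = (t0 - t1) * (t0 - t2) * (t1 - t2)
    # Coefficients of y = a*t^2 + b*t + c through the three points
    a = (t2 * (y1 - y0) + t1 * (y0 - y2) + t0 * (y2 - y1)) / denom
    b = (t2 ** 2 * (y0 - y1) + t1 ** 2 * (y2 - y0) + t0 ** 2 * (y1 - y2)) / denom
    return -b / (2 * a)  # vertex of the fitted parabola

# Three early samples of a load trajectory with its true minimum at day 30
t_min = predict_minimum_time([0, 10, 20], [900, 400, 100])
```

Targeting the sample at the predicted vertex, then refitting as each new measurement arrives, is what lets a closed-loop scheme find the minimum with far fewer draws than uniform sampling.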
The estimates of residual virus production obtained from patient data in (Luo et al. PLoS One 2012) were too large to be explained solely through the activation of latently infected quiescent T cells. We explored the relative contribution of quiescent cell activation and ongoing replication to the residual viral load through the analysis of time-series measurements of 2-LTR circular HIV-1 DNA artifacts following raltegravir intensification in well-controlled patients. Raltegravir is an integrase inhibitor, which means that it stops the HIV infection process after a DNA copy of the virus has been made, but before it can integrate into the target cell’s chromosomes. This floating piece of HIV DNA has a high probability of being “repaired” by the host cell’s DNA repair enzymes. This results in the formation of a circle of DNA, called a 2-LTR circle, which is no longer capable of infection.
We modeled the dynamics of 2-LTR formation for this experiment at two levels. A simple, reduced model was introduced in (Luo et al. J Roy Soc Interface 2013), allowing us to identify key dynamic parameters from the relatively sparse experimental data. A full spatial model including transport dynamics into lymphoid tissues is currently under review at PLoS One. Both models lead to the same conclusion: the observed 2-LTR dynamics in the 13 patients from the clinical trial could only have been produced if those patients had local regions in their body where the HIV infection was not controlled by the antiviral drugs. Our model also allows us to make an estimate of how much virus replication is occurring in the patient. For the patients in the study, these estimates ranged from 10^6 to 10^8 new infected cells per day. For comparison, turnover rates in patients not taking antiviral drugs are 10^9 to 10^10 infected cells per day.
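The core logic of the reduced model can be sketched with a single balance equation: after intensification, integration attempts in any uncontrolled compartment are shunted into 2-LTR circles, which then decay. The rates and units below are normalized and illustrative, not the published fitted values:

```python
def simulate_2ltr(replication_rate, days=60, dt=0.1, k_form=0.1, decay=0.35):
    """Minimal sketch of the reduced model's qualitative behavior:
    dC/dt = k_form * R - decay * C, where R is the (normalized) rate of
    ongoing infection events and C is the 2-LTR circle level.
    All parameter values are illustrative, not the published fits."""
    c, trace = 0.0, []
    for _ in range(int(days / dt)):
        dc = k_form * replication_rate - decay * c
        c += dc * dt  # forward-Euler step
        trace.append(c)
    return trace

with_replication = simulate_2ltr(replication_rate=1.0)
without_replication = simulate_2ltr(replication_rate=0.0)
```

In this sketch a measurable 2-LTR signal appears only when ongoing replication feeds the pool; with no replication the signal stays at zero, which is the qualitative discriminator the time-series analysis exploits.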
This finding has very important implications for HIV treatment. As mentioned previously, running out of available therapies is responsible for an average reduction in life expectancy of as much as 15 years. Regions of uncontrolled replication during apparently effective therapy provide a mechanism for ongoing replication of the virus under selective pressure toward drug resistance, making the development of resistance inevitable. Furthermore, before our study was published, there was no way to detect or monitor this cryptic HIV viremia. In addition to the published and submitted scientific papers on this topic, we have also applied for a patent on the method of detection and quantification of cryptic viremia described in this research.
Censored Data in Tracking Applications
Censored data are ubiquitous in biological applications, appearing most frequently as a lower limit of detection on particular measurements. Censoring also arises in several tracking and estimation applications. When attempting guidance using low-cost sensors, measurement saturation, a special case of censoring, becomes a significant problem. We have explored practical solutions to this problem for magnetic field measurement in guided ballistics in our recent publication (Allik et al. IEEE Trans Aero Elec Sys 2013). Our research in this area is funded by an ongoing grant from the Army Research Labs.
Another application where censoring is important is visual tracking. When the object being tracked either moves out of the camera frame or behind another occluding object, our measurement of that object’s position is censored in the same sense as biological measurements below the lower limit of detection. If we have a physics-based model of the object’s expected motion, we can continue to track the object’s maximum-likelihood position, as well as the uncertainty associated with the estimate, using the information contained in both the uncensored and censored observations.
Many applications of visual tracking require real-time estimation. Real-time computation requires that the estimator be defined recursively, so that the memory and computational cost required do not grow with the running time. To facilitate this, we developed an adaptation of the Kalman filter for Tobit censored applications. This is a truly recursive formulation, requiring only one time-step of memory, and has computational requirements comparable to those of the standard Kalman filter. Unlike the Kalman filter, the Tobit Kalman filter provides a bias-free estimate of the latent variable even when a significant fraction of the measurements are censored. We are exploring applications of this recursive estimator to various visual tracking and surveillance applications.
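The flavor of a Tobit-style measurement update can be sketched in the scalar case. Uncensored measurements receive the standard Kalman correction; a measurement reported at the detection limit instead contributes the mean of the truncated predicted-measurement distribution, weighted by the probability of censoring. This is an illustrative simplification, not the exact published filter:

```python
import math

def norm_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def norm_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def tobit_update(x_pred, p_pred, y, limit, r):
    """Scalar sketch of a Tobit-style Kalman measurement update.
    x_pred, p_pred: predicted state mean and variance; y: measurement;
    limit: lower detection limit; r: measurement noise variance.
    Illustrative only, not the exact published formulation."""
    s = p_pred + r                      # innovation variance
    if y > limit:                       # uncensored: standard Kalman step
        k = p_pred / s
        return x_pred + k * (y - x_pred), (1 - k) * p_pred
    # Censored: use E[y | y <= limit] via the inverse Mills ratio
    z = (limit - x_pred) / math.sqrt(s)
    mills = norm_pdf(z) / max(norm_cdf(z), 1e-12)
    y_expected = x_pred - math.sqrt(s) * mills
    k = (p_pred / s) * norm_cdf(z)      # gain scaled by censoring probability
    return x_pred + k * (y_expected - x_pred), (1 - k) * p_pred
```

Because the censored branch pulls the estimate below the limit rather than clamping it there, the filter avoids the upward bias a naive Kalman filter accumulates when many measurements saturate.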