d4T dose reduction does not result in poorer treatment outcomes in South African patients
By Keith Alcorn
The use of a reduced dose of d4T (stavudine), as recommended by the World Health Organization, did not reduce the likelihood of viral load suppression after six months in a large cohort of South African patients, researchers from the Aurum Institute of Health Research report in the August 24th edition of AIDS.
The use of d4T in first-line treatment has been phased out in the developed world due to toxicity, but in developing countries fixed-dose combinations containing d4T are still widely used due to the drug’s low cost compared to tenofovir or abacavir. In South Africa d4T remains a staple component of first-line treatment in the public health system, despite widespread calls for its use to be phased out.
The key toxicities associated with d4T are lipoatrophy (fat loss), hyperlactatemia and lactic acidosis, and peripheral neuropathy.
The drug was originally licensed at a dose of 40mg twice daily in adults weighing more than 60kg.
In 2007 the World Health Organization recommended that developing country treatment programmes should use a 30mg dose of d4T if it was not possible to phase out use of the drug. Their recommendation was based on a number of small studies which showed no negative effect of using a lower dose in adults weighing more than 60kg.
Adoption of the 30mg dose has been slow in some national programmes, and data are still lacking from an African population on the virological effects of initiating therapy with a lower dose of d4T.
Researchers at the Aurum Institute in Johannesburg analysed data from 618 patients enrolled in community-based HIV care programmes in South Africa who initiated treatment containing d4T between January 2006 and January 2008. All patients had been followed for at least six months after starting treatment, and weighed at least 60kg at baseline.
Of the eligible patients, 110 received a 30mg dose and 508 received a 40mg dose. Those receiving a 30mg dose were slightly more likely to receive nevirapine than efavirenz and to have WHO stage 4 HIV disease, and had significantly lower baseline CD4 counts (91 vs 115, p=0.0001). These differences were not a result of individualisation of treatment, say the investigators, but due to a change in guidelines during the period under study.
There was no significant difference after six months of treatment in the proportion of patients who had viral load below 400 copies/ml or 50 copies/ml (79% vs 81% and 60% vs 58% respectively). Multivariate analysis which adjusted for NNRTI agent, baseline viral load and weight showed no effect of dose on viral suppression.
The investigators say their findings provide additional evidence to support the WHO recommendation, but note that evaluation of long-term side effects according to dose is essential.
Reference
Hoffmann CJ et al. HIV suppression with stavudine 30mg versus 40mg in adults over 60kg on antiretroviral therapy in South Africa. AIDS 23 (13): 1784-1786, 2009.
Clinical officers and nurses make similar decisions to physicians on starting antiretroviral therapy in rural Uganda
By Roger Pebody and Carole Leach-Lemens
An analysis of decision-making by non-specialist physicians, clinical officers and nurses in Uganda has found that there is a high level of agreement in decisions on starting antiretroviral therapy.
The study, in the journal Human Resources for Health, contributes to the evidence base in support of task-shifting, and may ease concerns about a decrease in quality compared to current standards of care. It lends support to arguments for increased investment in training of nurses and clinical officers for the delivery of antiretroviral therapy in rural and semi-rural settings.
Rapid scale up of antiretroviral therapy has brought to light weaknesses in the health systems of developing countries. WHO estimates that over four million health workers are needed, and this shortage of medical doctors and other health workers trained to deliver HIV treatment and care has been identified as the most serious barrier to the sustained scale up of ART in resource-poor settings.
Task-shifting involves the delegation of healthcare tasks from more highly trained individuals to those with less training, and it has been increasingly employed as a way to help address the shortage of highly trained staff in many resource-limited settings.
Few countries in the developing world legally mandate non-physician clinicians to provide HIV treatment and care (exceptions include Malawi, Kenya, Ethiopia and Uganda). There is a reluctance to legalise task-shifting due to concerns about the capacity of the workers, as well as the quality of care provided.
In contrast, task-shifting is a common and very successful practice in the developed world, where non-physician clinicians (in particular nurse practitioners) play a central role in the treatment and care of HIV-positive patients.
Moreover the Integrated Management of Adolescent and Adult Illness (IMAI) guidance, developed by the World Health Organization and partners, uses task-shifting to support a public health approach to treating HIV within government health systems. The target audience is principally healthcare workers providing clinical care at front-line health facilities. More than 30 countries, mostly in sub-Saharan Africa, are currently using IMAI.
Ashwin Vasan and colleagues designed a study in which a non-physician first assessed the patient and made a recommendation on whether to start treatment, after which a physician repeated the procedure (without being aware of the initial recommendation). The study compares each worker's assessment, although all final clinical decisions were made by the physician.
Study sites were twelve district hospitals and subdistrict primary care clinics in Uganda. Sites were chosen based on their large HIV programmes, and staff were trained in the management of HIV through the Ministry of Health programme (an adaptation of generic WHO/IMAI protocols). Rural and semi-rural sites were chosen to provide an accurate picture of decentralised ART care. Viral load testing was not available and access to CD4 cell counts was limited at all sites.
Healthcare workers were classified as:
- Physicians: those who had completed a six-year medical school programme plus a one-year internship. Physicians were not usually HIV specialists.
- Clinical officers: those who had completed three years' pre-service education plus two years' internship.
- Nurses: those who had completed between one and four years' formal nursing education.
The authors acknowledge that the variation in the level of training of Ugandan nurses is a limitation of the study. Moreover, the study did not record the number of years of work experience that each worker had.
The researchers used Kappa analysis to evaluate levels of agreement. Kappa is a statistical measure of the degree of agreement between two observers, with strength of agreement rated as ‘slight’ (0 to 0.2), ‘fair’ (0.2 to 0.4), ‘moderate’ (0.4 to 0.6), ‘substantial’ (0.6 to 0.8) or ‘almost perfect’ (0.8 to 1).
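As a rough illustration of how this statistic works, the sketch below computes Cohen's kappa from a 2x2 table of paired start/do-not-start decisions. The table of counts is hypothetical, not the study's data; it is chosen only to show how kappa corrects observed agreement for agreement expected by chance.

```python
def cohens_kappa(table):
    """table[i][j] = number of patients rated category i by rater A
    and category j by rater B (here 2x2: start ART yes/no)."""
    total = sum(sum(row) for row in table)
    # Observed agreement: proportion of patients on the diagonal
    observed = sum(table[i][i] for i in range(len(table))) / total
    # Chance agreement: product of each rater's marginal proportions,
    # summed over categories
    expected = sum(
        (sum(table[i]) / total) * (sum(row[i] for row in table) / total)
        for i in range(len(table))
    )
    return (observed - expected) / (1 - expected)

# Hypothetical counts: rows = non-physician decision, cols = physician
table = [[60, 3],   # both say "start" / non-physician only says "start"
         [2, 35]]   # physician only says "start" / both say "do not start"
print(round(cohens_kappa(table), 2))  # 0.89, in the 'almost perfect' range
```

Here raw agreement is 95%, but because chance alone would produce substantial agreement, kappa is lower than the raw figure.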
The 521 eligible patients were HIV-positive adults not currently on treatment.
Each patient was first assessed by either a clinical officer or a nurse, and then by a physician. The primary outcome in the study was the final recommendation on whether or not to start antiretroviral therapy.
In the clinical officer arm, agreement with physicians on starting therapy was almost perfect (Kappa 0.91), with 95% of decisions being the same (as opposed to 43% agreement which could be expected by chance alone).
Agreement between nurses and physicians was less consistent and was calculated to be at the top end of the moderate category (Kappa 0.59). A total of 78% of decisions were in agreement, as opposed to 46% which could be expected by chance alone.
Moreover, nurses and clinical officers were both somewhat more likely than physicians to recommend an alternative to the standard d4T/3TC/nevirapine regimen.
Nurses’ assessments of patients’ WHO clinical stage and TB status were both in substantial agreement with those of physicians. On the other hand, the assessments of clinical officers for both these items were only in moderate agreement with the doctors.
On a number of items, the agreement of both nurses and clinical officers with physicians was assessed as ‘fair’. This was the case for functional status (able to work, able to walk, etc.), opportunistic infection status, absolute exceptions to starting antiretroviral therapy immediately, and patient readiness to start therapy. However the authors note that a number of these variables are subjective in nature. Moreover in some cases there was a great deal of missing data.
Although nurses had higher agreement scores on some items than clinical officers, the authors focus their conclusions on the latter group because of the decisions concerning initiation of therapy. They say that there is “compelling evidence” that clinical officers should be allowed to initiate therapy.
The authors believe that their findings show that under routine conditions at rural health facilities, and without any targeted increase in training or supervision of healthcare workers beyond the national framework, there is agreement in the clinical judgement between different cadres of healthcare workers in terms of starting antiretroviral therapy.
Nonetheless, they also describe the study as a pilot, and argue that “these preliminary data warrant more detailed and multicountry investigation”.
Reference
Vasan A et al. Agreement between physicians and non-physician clinicians in starting antiretroviral therapy in rural Uganda. Human Resources for Health 7:75, 2009.
Uganda survey shows major ART training gaps for non-physicians
By Carole Leach-Lemens
A survey of health facilities providing antiretroviral treatment in Uganda has found that nearly two-thirds of those providing ART are not doctors, and that this group reports major gaps in training. Two out of every five had received no training in starting patients on ART and two-thirds had not been trained in how to monitor patients on ART.
The findings were published in the August 23 edition of Human Resources for Health.
In self-assessment questionnaires, 7% of doctors, 42% of clinical officers, 35% of nurses and 77% of midwives thought their overall knowledge of ART was lower than “good”.
Task-shifting from physicians to nurses and clinical officers requires ongoing, integrated training to ensure the correct use and monitoring of ART, if toxicity and drug resistance are to be avoided and the success achieved to date in the management of HIV in resource-poor settings is to be maintained.
Access to ART continues to expand beyond urban centres into remote areas, and task-shifting is widely acknowledged as a means to counter the challenge presented by the chronic shortage of healthcare personnel in resource-poor settings. Studies have demonstrated that in some circumstances the quality of care provided by non-physician clinicians is equal to or better than that provided by physicians.
The Infectious Diseases Institute (IDI) at Makerere University, Uganda, together with the Ministry of Health, undertook a training needs assessment focusing on two of the World Health Organization’s recommendations for task-shifting to promote access to HIV and other health care services:
- Recommendation Four: countries undertake or update a human resources analysis on the extent to which task-shifting is already taking place and
- Recommendation Nine: countries adopt a systematic approach to harmonised, standardised and competence-based training that is needs-driven and accredited.
A survey of health professionals and heads of antiretroviral therapy clinics was undertaken in a stratified random sample of 44 of the country’s 205 accredited health facilities. Six of twelve catchment areas were chosen by lottery. The sample included six regional referral hospitals, 16 district hospitals and 22 health centres. Facilities were grouped by ownership (government, or non-governmental organisation and/or faith-based) and by whether antiretroviral therapy was being provided.
A sample of health professionals was chosen in collaboration with the head of the ART clinic. Criteria included being present on the day the study team visited. Efforts were made to have at least one doctor, one clinical officer, one nurse and one midwife from each clinic.
Data collection involved self-administered questionnaires for individual health professionals and face-to-face interviews with the heads of the antiretroviral clinics.
Forty-three of the 44 facilities selected were included, of which 38 provided ART and five (one district hospital and four health centres) did not.
Expansion of ART from urban clinics to district hospitals and primary care facilities is reflected in the numbers. Although regional referral hospitals provided ART to a higher proportion of people with HIV (45%) than district hospitals (33%) or health centres (17%), the authors suggest that over time these percentages may even out as care is transferred to accredited facilities closer to patients’ homes.
The sample comprised 265 clinicians: 34 doctors, 46 clinical officers, 124 nurses and 61 midwives. This distribution across professions was markedly different to the distribution of staff at ART clinics. Doctors were under-represented at all facilities, whereas nurses were over-represented at health centres and underrepresented at regional referral and district facilities. ART clinics at two district hospitals and two health centres had no doctors on staff.
ART tasks were performed by all staff interviewed, and 64% of clinicians prescribing ART were clinical officers, nurses or midwives. The authors note that this task-shifting was in line with expert recommendations and may well have contributed to increased access to ART.
The study revealed that training on starting and monitoring ART has not kept pace with task-shifting. Of those prescribing ART, 35% had not been trained in starting ART and 49% had not been trained in monitoring ART. These percentages differed across health professions: 27% of doctors had no training on monitoring ART compared to 64% of other clinicians. Similarly, 24% of doctors had no training on starting ART compared to 38% of clinical officers, 38% of nurses and 49% of midwives.
While higher percentages of doctors and clinical officers attended training on monitoring of ART and paediatric HIV care than nurses and midwives, a lower percentage of doctors and clinical officers attended training on voluntary counselling and testing than nurses and midwives.
Self-assessment of knowledge of ART also differed across professions and was closely related to training in starting and monitoring ART. Ratings of “excellent”, “very good” and “good” were grouped together as “sufficient”. 75% of all respondents deemed their overall knowledge of HIV sufficient, while 40% rated their overall knowledge of ART as sufficient; 7% of doctors prescribing ART rated their knowledge lower than “good”, compared to 48% of other clinicians.
Limitations noted by the authors include overrepresentation of certain professionals at some facilities and underrepresentation of others due to reliance on those present at accredited ART clinics on the day of the study.
The authors further note that two remote facilities in the sample were replaced by sites that were easier to reach. Task-shifting and an absence of training, they believe, were more likely in remote facilities, so the sample may have underestimated both the extent of task-shifting and the associated ART training needs.
The authors suggest that this assessment provides an innovative method that can be replicated to inform ART training in the context of ongoing scale-up and task-shifting. They conclude that “Training initiatives should be an integral part of the support for task-shifting and ensure that ART is used correctly and toxicity or drug resistance does not reverse the successes to date.”
Reference
Lutalo IM et al. Training needs assessment for clinicians at antiretroviral therapy clinics: evidence from a national survey in Uganda. Human Resources for Health 7:76, 2009.
Earlier treatment could be cost-effective for South Africa
By Keith Alcorn
Starting treatment at a CD4 count of 350 would be cost-effective in South Africa, with a maximum extra expenditure of $1.5 billion over five years if everyone eligible was treated, according to findings from a collaborative study by American and South African researchers published in the August 4th edition of the Annals of Internal Medicine.
South Africa’s Department of Health is currently considering a recommendation from the South African National AIDS Council (SANAC) to revise national guidelines so that people diagnosed with HIV infection can receive treatment once their CD4 counts fall below 350 cells/mm3.
This would bring the South African public sector into line with recommendations in the United States and Europe, and align the public sector with the recommendations of the South African HIV Clinicians Society, which are already followed by most private sector providers.
Earlier treatment has the potential to reduce deaths and illness, but only if individuals with HIV infection are tested and then initiated on treatment before experiencing a serious CD4 cell decline.
At a symposium during the International AIDS Society conference in Cape Town last month Dr Francois Venter, President of the South African HIV Clinicians Society, warned that South Africa was still failing to initiate treatment promptly in people diagnosed with very low CD4 counts. He said that the health system needed to address this issue urgently, and that current debates about earlier treatment were simply ignoring this question.
In their analysis, Rochelle Walensky and colleagues in the Cost-Effectiveness of Preventing AIDS Complications (CEPAC) International group used a computer-based model of HIV disease progression and 2006 South African health system costs, together with WHO estimates of the number of people infected with HIV, in order to assess the potential increase in treatment costs and gains in life expectancy if treatment guidelines were to recommend earlier HIV treatment for South Africans.
The full methodology of the study can be reviewed in the published journal article, which is freely available online.
Patients in the hypothetical cohort were assumed to have a baseline CD4 count of 375, and their disease course over the next five years was projected, together with the costs of providing care to those who either became eligible for treatment under current South African guidelines, or who required medical care as a result of falling ill due to HIV infection, or who died.
The study assumed that in people who received treatment according to different potential guidelines, 84% would achieve a viral load below 50 copies/ml after 48 weeks, with a mean CD4 cell increase of 184 cells within the same period. Of those who failed first-line treatment, 71% would be assumed to achieve undetectable viral load. These estimates were based on data from South African cohorts already receiving treatment.
They found that over a five-year period 4.7 million people in South Africa would become eligible for treatment if the CD4 count threshold was raised to 350, of whom 1.2 million are already judged to be eligible, with a further 1.6 million becoming eligible within one year. In years three, four and five, another 1.9 million people would become eligible, indicating that the impact of changing the guidelines could be felt very quickly by the health system.
However, not all these patients are likely to be identified. The researchers analysed the impact of three different levels of diagnosis and linkage to care. If 10% of people eligible for treatment under new guidelines actually started treatment the effect on deaths and illness would be modest (just 25,000 fewer deaths over a five-year period), and taking into account the reduced cost of treating opportunistic infections, health spending would rise by $142 million over five years.
If fifty per cent of those eligible were diagnosed and started treatment, around 600,000 deaths would be averted (a one-third reduction), while health spending would rise by a net $1.1 billion over five years.
One hundred per cent success in diagnosing and treating everyone eligible would avert around 1.5 million deaths and lead to additional health expenditure of $1.5 billion over five years, even taking into account the money saved through illness averted.
Current debates over earlier treatment in resource-limited settings also focus on the long-term financial sustainability of treating larger numbers of patients. Cost-effectiveness analysis can also demonstrate how changes in life expectancy and health service utilisation in people who receive treatment translate into long-term costs, and the extent to which the investment is affordable within a country’s own resources.
In the case of South Africa, the CEPAC International analysis shows that earlier treatment would be affordable within the framework endorsed by the World Health Organization for assessing the cost-effectiveness of new health interventions in developing countries. The incremental cost-effectiveness ratio is the cost per life-year saved by an intervention, and the WHO yardstick compares it to GDP per capita: where the cost per life-year saved is less than one times GDP per capita, the investment represents very good value for money, and where it is less than three times GDP per capita, it is still considered cost-effective.
If starting treatment at a CD4 count of 350 results in a superior outcome to treatment at 250, as a recent trial in Haiti has shown, earlier treatment is highly cost-effective for South Africa, with an incremental cost-effectiveness ratio of $1200 per life-year saved compared to starting treatment at a CD4 count of 250 (South African GDP per capita in 2006 was $5400).
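The WHO yardstick described above amounts to a simple ratio test. The sketch below applies it to the figures reported in the article ($1200 per life-year saved against GDP per capita of $5400); the function name and category labels are illustrative, not a formal WHO classification.

```python
def who_ce_verdict(cost_per_life_year, gdp_per_capita):
    """Classify an intervention by the ratio of its cost per
    life-year saved to GDP per capita, per the WHO thresholds."""
    ratio = cost_per_life_year / gdp_per_capita
    if ratio < 1:
        return "very good value for money"
    if ratio < 3:
        return "cost-effective"
    return "not cost-effective"

# Figures from the article: $1200 per life-year saved, $5400 GDP per capita
# Ratio is 0.22, well under the threshold of 1
print(who_ce_verdict(1200, 5400))  # very good value for money
```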
However treatment advocates have criticised current levels of funding in South Africa, and say they are inadequate to meet current needs, let alone expanded demand for treatment.
Reference
Walensky RP et al. When to start antiretroviral therapy in resource-limited settings. Ann Int Med 151: 157-166, 2009.
IRIS responsible for low number of deaths in patients starting HIV treatment in Uganda
By Michael Carter
Immune reconstitution inflammatory syndrome (IRIS) is not a significant cause of death in patients initiating antiretroviral therapy in Uganda, investigators report in the September 15th edition of Clinical Infectious Diseases, now available online. In a retrospective study, the investigators found that although 17% of patients died within the first three years of starting HIV treatment, only a small proportion of these deaths were attributable to IRIS.
"In our experience, the contribution of IRIS…to early mortality is limited," comment the investigators.
HIV treatment has been shown to be as effective in resource-limited settings as it is in developed countries. However, significant early mortality and loss to follow-up has been observed in patients starting antiretroviral therapy in a number of African countries. There has been little research into the conditions causing these deaths, nor has the contribution of IRIS to early mortality been well characterised.
After an individual starts HIV treatment, their immune system starts to improve and this can unmask sub-clinical infections, sometimes resulting in an inflammatory response that can cause unpleasant symptoms, illness, and – in rare cases – death.
Investigators from Kampala therefore undertook a retrospective study lasting three years that recorded mortality rates and the cause of death in 559 individuals who started HIV treatment for the first time between 2004 and 2005.
These patients had a median age of 38 years and 69% were women. At the time HIV treatment was started, the patients had advanced HIV disease, with 89% having an AIDS diagnosis, and the median CD4 cell count was 98 cells/mm3. A third of patients had a body mass index (BMI) of 18 kg/m2 or below and 36% had a haemoglobin level below 11 g/dl.
A total of 99 patients died during the three years of the study. As in other studies investigating deaths amongst patients starting HIV treatment in Africa, most deaths (80; 14% of the entire cohort) occurred during the year after HIV treatment was started, with 54 of these (73% of deaths) occurring in the first three months. There was a significant fall in mortality thereafter (15 deaths in the second year and four in the third).
This provided a mortality rate of 17.9 per 100 person years in the first year after starting HIV treatment, 2.3 per 100 person years in the second year, and 1.2 per 100 person years in the third year.
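A rate per 100 person-years divides the number of deaths by the total follow-up time the cohort contributed, scaled to 100 person-years. The sketch below shows the calculation; the person-year denominator is hypothetical (the article does not report it), chosen so the result matches the reported first-year rate.

```python
def mortality_rate_per_100py(deaths, person_years):
    """Deaths per 100 person-years of follow-up."""
    return 100 * deaths / person_years

# 80 first-year deaths over an assumed ~447 person-years of follow-up
# reproduces the reported first-year rate of 17.9 per 100 person-years
print(round(mortality_rate_per_100py(80, 447), 1))  # 17.9
```

Note that the denominator is well below 559 (one year each) because patients who died or were lost early contributed only part of a year at risk.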
Of the 80 deaths in the first year, 69 (86%) were HIV-related, with four attributed to IRIS. All four cases were "unmasking events": one case each of cryptococcal meningitis, extra-pulmonary tuberculosis (TB), TB meningitis and a brain tumour.
Overall, TB and cryptococcal disease were the most common causes of death, and almost two-thirds of the deaths attributed to TB occurred in individuals who had symptoms of the infection before initiating antiretroviral therapy.
In multivariate analysis that controlled for possible confounding factors, the baseline characteristics associated with both all-cause and HIV-related death were a CD4 cell count below 25 cells/mm3, a BMI below 18 kg/m2, and a haemoglobin level below 8 g/dl.
"We observed a high mortality rate of 14% during the first year of therapy, particularly during the first three months," write the investigators. They note that many of these deaths were due to infections and "may have been preventable if the infrastructure for opportunistic infection screening was routinely available and patients were given prophylaxis treatment prior to antiretroviral therapy initiation".
They emphasise that few deaths could be attributed to IRIS, supporting "the view that, in most cases, IRIS is a self-limited clinical entity".
Reference
Castelnuovo B et al. Cause-specific mortality and the contribution of immune reconstitution inflammatory syndrome in the first 3 years after antiretroviral therapy initiation in an urban African cohort. Clin Infect Dis 49 (online edition), 2009.
Further evidence of needless treatment switches in absence of viral load testing
By Keith Alcorn
Further evidence has emerged that a substantial proportion of switches to second-line treatment in a resource-limited setting, triggered in the absence of viral load testing, are unnecessary and result in an avoidable inflation in drug costs as people switch to more expensive regimens.
The findings, published in the August 1st edition of Clinical Infectious Diseases, are likely to lend further support to calls for viral load testing to be made more accessible in resource-limited settings to confirm cases of suspected treatment failure.
In well-resourced settings everyone receiving treatment undergoes regular viral load testing in order to detect viral rebound and the failure of treatment. Switches to new treatment take place if viral rebound is detected, since the existing regimen becomes ineffective - due to drug resistance - once viral rebound occurs.
In resource-limited settings, viral load testing is rarely available due to cost and lack of well-equipped laboratories.
Failure of first-line treatment can be detected only by monitoring the CD4 count for declines or looking for the development of clinical symptoms.
It had been widely assumed that CD4 counting would chiefly tend to result in delayed identification of large numbers of cases of viral rebound because of the time lag between viral rebound and subsequent loss of CD4 cells due to uncontrolled viral replication. It was feared that the major consequence would be that large numbers of patients would develop high-level resistance to some second-line drugs.
However research presented at the Conference on Retroviruses and Opportunistic Infections in February this year showed that treatment switches on the basis of CD4 counts were often unnecessary, because the patients often continued to have undetectable viral load despite a decline in CD4 count. The researchers who conducted the study, in Uganda, suggested that infections such as malaria could be causing temporary dips in CD4 count.
They also estimated that in a cohort of 125 patients who experienced CD4 declines, 107 would have been switched to more expensive second-line treatment, adding $75,000 in drug costs to the treatment programme’s budget.
Now, research from western Kenya has confirmed that the Ugandan observation is a common problem.
AMPATH, a service collaboration between Moi University and local clinics in the Eldoret region of western Kenya, carried out viral load tests on all patients receiving ART who had suspected immunologic signs of treatment failure (a CD4 cell decrease of at least 25% over the previous six months).
The retrospective study identified 149 patients who had suspected treatment failure. Of these 58% turned out still to have a viral load below 400 copies, and even among the subset of 42 who experienced a CD4 decline of more than 50% during the previous six months, 43% (18) still had a viral load below 400 copies, indicating that there was no need to switch treatment in those cases.
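The scale of unnecessary switching implied by these figures can be shown with back-of-envelope arithmetic (a sketch using the percentages reported above, not additional data from the study):

```python
# Of 149 patients flagged as failing by CD4 criteria, 58% still had a
# viral load below 400 copies/ml, meaning a switch to second-line
# treatment would have been unnecessary for them.
suspected_failures = 149
proportion_suppressed = 0.58

misclassified = round(suspected_failures * proportion_suppressed)
print(misclassified)  # patients who did not need to switch
```

On these figures, roughly 86 of the 149 flagged patients would have been switched needlessly had viral load testing not been available.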
Among those with a CD4 cell count above 200 at the time of suspected treatment failure, two-thirds (66%) had a viral load below 400 copies, compared to 41% of those with a CD4 count below 100 cells/mm3.
When misclassification was analysed according to CD4 cell percentage rather than absolute number it became clear that the highest risk of 'true' treatment failure occurred in those with a CD4 cell percentage below 10 (65% had viral load above 400 copies, compared to only 26% of those with a CD4 percentage between 20 and 29).
Logistic regression analysis showed that misclassification of treatment failure was more likely if the patient had a higher CD4 count, a shorter duration of treatment and a smaller decline in CD4 cell percentage.
“In our study, there was a high likelihood of failure if the patient had a CD4 cell count of <200 cells/uL and was on therapy for > 20 months; there was a low likelihood of failure of therapy if the patient had a CD4 count of <300 and >200 cells/uL and was on therapy for <12 months.”
At AMPATH clinics, viral load testing is now mandatory in all cases of suspected treatment failure, but, say the authors: “We recognize the fact that … selective virological monitoring may not be instantly achievable. These results suggest the need to reconsider recommendations on immunological monitoring in resource-limited settings.”
They suggest that use of CD4 percentages may improve the sensitivity of immunological monitoring for treatment failure, but say that their findings need to be evaluated in other populations before generalised conclusions can be drawn.
They also note that a previous simulation study by Professor Andrew Phillips, which found only a modest cost-effectiveness benefit of viral load and CD4 monitoring over clinical monitoring in resource-limited settings, was based on the assumption that misclassification of treatment failure occurred in no more than 19% of cases.
They note several limitations: the fact that they could not verify viral load and CD4 measures; an average delay of two months between CD4 count and viral load test; and a lack of information about seasonal variations in CD4 count or changes in CD4 count due to intercurrent illnesses such as malaria.
In an accompanying editorial, doctors from Kenya and South Africa say: “In 2008 Smith and Schooley referred to managing ART without viral load as “running with scissors”. The emerging data…suggest it is more akin to throwing these programs onto drawn swords.”
“The time has come to work towards the progressive introduction of appropriate viral load monitoring technology in these programs with the same sense of urgency and commitment as the world approached ART access. To do less is to abandon the early success of ART to global collapse.”
Reference
Kantor R et al. Misclassification of first-line antiretroviral treatment failure based on immunological monitoring of HIV infection in resource-limited settings. Clin Infect Dis 49: 454-462, 2009.
Sawe FE, McIntyre JA. Monitoring antiretroviral therapy in resource-limited settings: time to avoid costly outcomes. Clin Infect Dis 49: 463-464, 2009.
d4T toxicity risk seven-fold higher when started with TB treatment
By Keith Alcorn
In patients receiving treatment for TB, initiating antiretroviral treatment that contains d4T results in a greatly increased risk of d4T-related toxicity during the first two months of antiretroviral treatment, US and South African researchers report. They say that all TB patients should be screened for pre-existing neuropathy before starting antiretroviral treatment, and that alternatives to d4T should be considered for antiretroviral treatment (ART) alongside TB treatment.
The findings are published in the June 1st edition of the journal Clinical Infectious Diseases by researchers from the University of North Carolina and the Clinical HIV Research Unit of the University of the Witwatersrand, Johannesburg.
First-line antiretroviral treatment regimens containing d4T (stavudine) remain the norm in many developing countries due to the low cost of fixed-dose combinations containing the drug.
However d4T has a number of serious potential toxicities including lactic acidosis, peripheral neuropathy and lipoatrophy. Lactic acidosis is life-threatening if early symptoms are not recognised, and nerve damage caused by the drug may exacerbate existing neuropathy caused by HIV, leading to irreversible, extremely painful, harm to the nerves in the feet and legs.
For many patients receiving TB treatment there is a need to initiate antiretroviral treatment either concurrently or after TB treatment has begun, due to a low CD4 count and the consequent high risk of further opportunistic illnesses. In South Africa at least 20% of people with HIV who start a course of TB treatment need to initiate antiretroviral treatment at the same time as TB treatment.
But TB treatment contains isoniazid, a drug also known to cause peripheral neuropathy.
This study was designed to examine the extent to which d4T use alongside TB treatment results in termination of d4T use and substitution with another drug, the most concrete marker of unacceptable toxicity available for measurement.
The researchers reviewed data on 7066 patients who initiated d4T-containing ART at the Themba Lethu clinic in Johannesburg between April 2004 and March 2007. Of this group, 1845 patients received treatment for active TB that coincided with the period of d4T treatment: 1272 were already receiving TB treatment when they began ART, 224 commenced TB treatment at the same time as ART and 349 required TB treatment after beginning ART.
Of all patients who initiated ART, regardless of TB history, 3.7% died, 17.7% were lost to follow-up and 17.3% (1219) switched from d4T to another drug (of whom 69% changed only d4T). The overall rate of substitution was 12.1 per 100 person years of follow-up, with the rate of substitution increasing with the duration of d4T treatment from 7.9 per 100 person years in months zero to six, to 18.1 per 100 person years beyond year one.
The analysis controlled for the effects of sex, age, previous history of peripheral neuropathy or TB, haemoglobin, body mass index, CD4 count, WHO disease stage and stavudine dosage on the risk of switching from d4T during TB treatment.
Among patients who received TB treatment, hazard ratios for d4T substitution compared to those who did not receive TB treatment were:
Adjusted hazard ratio of substituting d4T (95% confidence interval in brackets)

| | Months 0-2 | Months 3-6 | After month 6 |
|---|---|---|---|
| Already on TB treatment when d4T started | 3.18 (1.82 – 5.56) | 2.51 (1.77 – 3.54) | 1.19 (0.94 – 1.52) |
| Started TB treatment and ART at the same time | 6.6 (3.03 – 14.37) | 1.88 (0.87 – 4.09) | 1.07 (0.65 – 1.76) |
| Started TB treatment more than 2 months after ART | 0.99 (0.46 – 2.12) | 1.18 (0.61 – 2.25) | 0.87 (0.28 – 2.73) |
Forty-three per cent of single-drug d4T substitutions were due to peripheral neuropathy, 24% to lipodystrophy and 20% to lactic acidosis or symptomatic hyperlactataemia. The dose of d4T received did not appear to affect the risk of substitution. Patients who received TB treatment were more likely to switch due to peripheral neuropathy.
“Our results show that initiation of TB treatment and stavudine-based HAART within a 2-week window puts patients at a nearly 7-fold increased risk of stavudine substitution in the first 2 months of HAART,” the authors note.
They suggest that their results may, if anything, understate the frequency with which concomitant d4T and TB treatment leads to serious peripheral neuropathy, because South African patients receive vitamin B6 alongside isoniazid to reduce the risk of peripheral neuropathy. In addition, patients with peripheral neuropathy often receive amitriptyline to manage neuropathic pain.
Without these interventions, severe peripheral neuropathy could lead even more frequently to switches from d4T, they say.
The study was unable to assess the extent to which milder peripheral neuropathy is emerging as a result of concomitant TB and d4T treatment.
Given that at least 20% of South Africans with HIV who start a course of TB treatment need to begin antiretroviral therapy while taking isoniazid, the findings have clear public health relevance. The authors say that screening for peripheral neuropathy is important in all patients receiving TB treatment and d4T, especially those who start the two within a short time of each other. In addition, “where additional antiretroviral drugs are available,” they say, “we may wish to reconsider the use of stavudine in first-line HAART for patients with ongoing or concurrent initiation of TB treatment.”
Further information
NAM’s electronic newsletter HIV & AIDS Treatment in Practice recently published an extensive clinical review on the management of peripheral neuropathy in people with HIV in resource-limited settings.
Reference
Westreich D et al. Tuberculosis treatment and risk of stavudine substitution in first-line antiretroviral therapy. Clin Infect Dis 48:
HIV treatment should be started earlier in resource-limited settings, shows trial
By Michael Carter
A major clinical trial has shown that HIV-positive patients in resource-limited countries are more likely to survive, and experience less HIV disease progression, if they start antiretroviral therapy when their CD4 cell count is between 200 and 350 cells/mm3, rather than waiting until it falls below 200 cells/mm3.
Interim analysis of the CIPRA HT 001 study, a randomised controlled trial being conducted in Haiti, showed that patients who started HIV treatment before their immune system had been severely damaged had better outcomes.
A total of 816 adults with CD4 cell counts between 200 and 350 cells/mm3 were recruited to the study. Half were randomised to start HIV treatment immediately; the others deferred treatment until their immune systems had weakened further or they developed an AIDS-defining illness. The combination of antiretroviral drugs used in the study was efavirenz, 3TC and AZT.
The study’s independent safety and monitoring board recommended that the trial should be stopped early after a planned interim analysis showed that only six patients who started antiretroviral therapy when their CD4 cell count was between 350 – 200 cells/mm3 died. (Patient recruitment began in 2005 but the median duration of follow-up is unspecified in the NIAID press release).
This compared to 23 deaths amongst individuals who started treatment later. Furthermore, twice as many individuals in the deferred treatment arm of the study developed tuberculosis, an AIDS-defining illness, than did patients who started HIV treatment before their immune systems were severely damaged by the virus.
In the light of these findings, the study’s monitoring board recommended that it should be terminated immediately and that all individuals in the deferred treatment arm should be offered anti-HIV drugs.
HIV treatment guidelines in industrialised countries, such as the UK, recommend that HIV treatment should be started when an individual’s CD4 cell count is around 350 cells/mm3. Starting treatment at this time has been shown to reduce the risk of developing both HIV-related and non-HIV-related illnesses.
However, in resource-limited settings, where only 30% of eligible individuals are currently taking anti-HIV drugs, treatment is not started until a patient’s CD4 cell count falls below 200 cells/mm3 or they develop AIDS. Many individuals therefore start HIV treatment when their immune systems are already severely weakened, and in many cases they die before they have the opportunity to benefit from it.
“The public health community now has evidence from a randomized, controlled clinical trial – the gold standard – that starting antiretroviral therapy between 200 and 350 cells/mm3 in resource-limited settings yields better health outcomes than deferring treatment”, said Dr Anthony Fauci, director of the US National Institute of Allergy and Infectious Diseases, which sponsored the study.
High prevalence of lipodystrophy amongst patients taking HIV treatment in Senegal: d4T sole risk factor
By Michael Carter
Approximately a third of patients starting antiretroviral treatment in Senegal developed moderate to severe lipodystrophy, with the use of d4T being the sole risk factor, investigators report in a study published in the online edition of the Journal of Acquired Immune Deficiency Syndromes.
Lipodystrophy, a disturbance in the way the body stores and processes fat, was first widely described as a potential side-effect of antiretroviral therapy in 1998. Its prevalence and possible causes have been well-described in industrialised countries, but there is little information about lipodystrophy in resource-limited settings.
Investigators from Senegal undertook a study to describe the prevalence and causes of body fat changes in people with HIV, and to assess the impact of HIV treatment on lipids. Their study was case-controlled: 189 patients who had been treated with anti-HIV drugs for between four and nine years were each matched with an HIV-negative individual of the same age and sex.
Changes in subcutaneous fat were assessed by the patients’ physicians, and blood samples were taken to measure the levels of blood fats. Measurements of skin folds were also taken, as were waist circumferences and waist-to-hip ratio.
The patients had a mean age of 42 years and 60% were women. The mean duration of HIV treatment was approximately five and a half years.
Earlier research has shown that the drugs most associated with fat loss (lipoatrophy) are d4T (stavudine, Zerit) and AZT (zidovudine, Retrovir). The HIV treatment of 83 patients in the Senegalese study involved AZT, with 63 individuals taking therapy that included d4T.
Moderate-to-severe lipodystrophy was diagnosed in 31% of patients. This included 13% of patients who had fat loss, 15% with fat gain, and 3% with a combination of both. Women were more likely than men to experience body fat changes after starting HIV treatment (34% vs 26%), but the difference was not statistically significant.
However, when the investigators employed a broader definition of lipodystrophy, they found that almost two-thirds of patients (65%) had developed body fat changes to some extent, ranging from mild to severe. This included 18% with fat loss, 36% with fat gain, and 12% with both.
Lipodystrophy was also assessed using body measurements. Using this method, 50% of patients were found to have developed body fat changes after starting HIV treatment.
The investigators also found that HIV-positive patients weighed significantly less than the controls (mean 64kg vs 72kg, p < 0.001), and had a significantly lower body mass index (mean 22.5kg/m2 vs 25.7kg/m2, p < 0.001). Hip circumference was also lower in HIV-positive individuals (95cm vs 101cm, p < 0.001). Furthermore, mid-upper arm circumference was significantly lower in patients with HIV (28cm vs 31cm, p < 0.001), as were skin fold measurements at all the assessed body sites (p < 0.001).
Unsurprisingly, metabolic abnormalities were more common amongst the HIV-positive patients than the controls. Levels of fasting glucose were higher in the individuals taking HIV treatment (mean 4.86 mmol/l vs 3.60 mmol/l, p < 0.0001), as were triglycerides (mean 1.03 mmol/l vs 0.79 mmol/l, p = 0.007), while HDL cholesterol was lower in the patients with HIV (mean 1.22 mmol/l vs 1.33 mmol/l, p = 0.008).
Next the investigators compared the metabolic profiles of the HIV-positive patients with and without body fat changes. This showed that patients with such changes had significantly higher triglycerides (p = 0.0005), total cholesterol (p = 0.007) and LDL-cholesterol (p = 0.04).
Finally, the investigators conducted analyses to identify possible risk factors for the development of lipodystrophy. The only significant risk factor revealed by these was treatment with d4T (p < 0.01).
After a year of d4T treatment, 22% of patients had developed body fat changes, increasing to 40% after three years and 50% after five years of treatment with the drug.
“Thirty-one percent of subjects had moderate-severe lipodystrophy, and the only independent significant risk factor was stavudine therapy”, comment the investigators.
Reference
Mercier S et al. Lipodystrophy and metabolic disorders in HIV-1-infected adults on 4- to 9-year antiretroviral therapy in Senegal: a case-control study. J Acquir Immune Defic Syndr (online edition), 2009.
Weight gain predictive of survival in Cambodians and Kenyans taking antiretroviral therapy
By Kelly Safreed-Harmon
A study published in the April 27th issue of AIDS indicates that weight gain may be a reliable predictor of survival in underweight men and women starting antiretroviral therapy. The finding has broad implications because resource limitations in many developing countries preclude the use of laboratory monitoring to assess treatment effectiveness. If healthcare providers have some other means of identifying which patients are responding poorly to antiretroviral regimens, they may be able to intervene before those patients become dangerously ill.
The cohort study analysed mortality rates six and twelve months after the initiation of antiretroviral therapy. Study participants were being treated in Médecins Sans Frontières (MSF) programmes in Phnom Penh, Cambodia and Homa Bay, Kenya. The most striking finding was that people who had an initial body mass index (BMI) score of 18.5 kg/m2 or less and experienced weight gains of 10% or less during the first three months of antiretroviral therapy were far more likely to die within the next three months than people who had comparable initial BMI scores but experienced greater weight gains.
The study population comprised 2451 Cambodian adults and 2618 Kenyan adults. MSF followed World Health Organization (WHO) recommendations for initiating antiretroviral therapy: people offered antiretrovirals had either a WHO stage 4 condition, a WHO stage 3 condition with a CD4 cell count of less than 350 cells/mm3, or a CD4 cell count of less than 200 cells/mm3.
All antiretroviral regimens consisted of two nucleoside reverse transcriptase inhibitors (NRTIs) and one non-nucleoside reverse transcriptase inhibitor (NNRTI). In Cambodia, 51% of study participants received 3TC (lamivudine, Epivir), d4T (stavudine, Zerit), and nevirapine (Viramune), while 47% received 3TC, d4T, and efavirenz (Sustiva). In Kenya, 86% of people received the nevirapine-containing regimen, and 9% received the efavirenz-containing regimen.
The study evaluated the prognostic value of weight gain using four categories of initial BMI scores: ≤17 kg/m2; >17 to ≤18.5 kg/m2; >18.5 to ≤20 kg/m2; and >20 kg/m2. Individuals in the first two categories are considered underweight by international standards.
Mortality was analysed in relation to three levels of BMI increase at three and six months: ≤5%; >5% to ≤10%; and >10%. Weight gain was found to be predictive of survival for study participants with initial BMI scores in the lower two categories, i.e. those who were underweight. People with an initial BMI score of ≤18.5 kg/m2 and weight gain of ≤5% had a mortality rate ratio (MRR) of 6.3 when compared to those in the same initial BMI category with weight gain of >10% (95% confidence interval [CI], 3.0 – 13.1).
The MRR for people with an initial BMI score of ≤18.5 kg/m2 and weight gain of >5% to ≤10% was 3.4 when compared to those in the same initial BMI category with weight gain of >10% (95% CI, 1.4 – 8.3).
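The stratification described above can be sketched as a small helper, using the cut-offs quoted in the article (the function names and example figures are illustrative, not taken from the paper):

```python
# Sketch of the study's categorisation scheme; cut-offs are those quoted
# in the article, everything else is illustrative.

def bmi_category(bmi):
    """Assign a baseline BMI (kg/m2) to the study's four bands."""
    if bmi <= 17:
        return "<=17"
    elif bmi <= 18.5:
        return ">17 to <=18.5"
    elif bmi <= 20:
        return ">18.5 to <=20"
    else:
        return ">20"

def weight_gain_band(baseline_kg, three_month_kg):
    """Percentage weight gain at three months, banded as in the study."""
    gain = 100 * (three_month_kg - baseline_kg) / baseline_kg
    if gain <= 5:
        return "<=5%"
    elif gain <= 10:
        return ">5% to <=10%"
    else:
        return ">10%"

# Example: an underweight patient (baseline BMI 17.8) gaining 4% in three
# months falls in the highest-risk stratum reported (MRR 6.3 vs >10% gainers).
print(bmi_category(17.8))            # ">17 to <=18.5"
print(weight_gain_band(50.0, 52.0))  # "<=5%" (a 4% gain)
```

Reading the MRRs against these strata: within the underweight bands, slower weight gain maps monotonically onto higher mortality (MRR 6.3 for ≤5% gain, 3.4 for >5% to ≤10%, reference 1.0 for >10%).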
When the researchers compared the prognostic value of weight gain in men and women, they found no significant differences. Nor were there differences between Kenyans and Cambodians; between people who started antiretroviral therapy at different disease stages or CD4 count levels; or between people using different antiretroviral regimens. All of this indicates that tracking weight over time can be an effective strategy in a wide range of antiretroviral recipients.
Weight gain was not predictive of survival for people whose initial BMI score was higher than 18.5 kg/m2. This somewhat lessens the value of weight monitoring as a clinical management tool. However, given that many people in resource-limited settings do not begin treatment until relatively late in the course of HIV disease, low BMI scores at treatment initiation are not uncommon. Forty-six percent of Cambodian study participants and almost forty percent of Kenyan study participants had BMI scores of 18.5 kg/m2 or less.
“Weight gain can be of great use in resource-limited settings, especially when decentralization of HIV care is required and access to well-trained physicians is limited,” the authors conclude. They go on to note that three possible reasons for an HIV-positive person’s failure to put on weight after initiating antiretroviral therapy are poor medication adherence, opportunistic infections, and insufficient nutritional intake. They advise assessing adherence and providing adherence counselling as warranted.
The authors express particular concern about the importance of screening for tuberculosis (TB) in antiretroviral non-responders, noting, “The experience from MSF in five countries showed a high incidence of TB under [antiretroviral therapy], and TB remains a leading cause of death in resource-limited settings.”
The authors stress that identifying antiretroviral non-responders by tracking weight should only be regarded as an interim solution. “Our results should not be interpreted as advocacy for minimal care and monitoring of patients taking ART in developing countries,” they write. “CD4 cell count and viral load remain the gold standards for patient monitoring, and everything should be done to make these tests available in resource-limited settings.”
Reference
Madec Y et al. Weight gain at three months of antiretroviral therapy is strongly associated with survival: evidence from two developing countries. AIDS 23: 853–861, 2009.
ARV roll-out in Ethiopia has reduced adult AIDS deaths by 50% in capital
By John Owuor
The roll-out of antiretroviral therapy has led to a decline of about 50% in adult AIDS deaths in Ethiopia's capital, Addis Ababa, over a period of five years, according to the findings of a study published in the February 20th edition of the journal AIDS.
The effectiveness of antiretroviral roll-out in sub-Saharan Africa has been widely reported as encouraging, despite persistent concerns about universal access and adherence. However, there are still only limited data on its population-level effect on mortality.
In Ethiopia, antiretroviral treatment was made freely available in public hospitals from October 2005. The investigators carried out the current study to find out what effect the availability of antiretroviral treatment had had on AIDS-related mortality.
The researchers used data from burial surveillance records and 'verbal autopsy' interviews. Burial surveillance was implemented in all Addis Ababa cemeteries in 2001 and records about 20,000 deaths per year. The surveillance is undertaken by cemetery clerks who receive regular training. They record a lay report of the cause of death as narrated by close relatives or friends of the deceased and other demographic details.
Verbal autopsies are post-mortem interviews conducted by researchers with close relatives or caretakers, about the signs and symptoms they witnessed during the terminal illness of the deceased. Causes of death described in the interviews were then confirmed through physicians’ review.
Two physicians independently reviewed each verbal autopsy to assign a cause of death to the described symptoms. Whenever the causes assigned by the two physicians did not match, a third physician reviewed the verbal autopsy questionnaire. The data used in the current study were derived from 413 cases involving individuals aged 20 to 64 years. Causes of death were classified as either AIDS or non-AIDS deaths.
Epidemiological modelling was used by the investigators to determine mortality trends in the study population.
The investigators then compared projected deaths with observed numbers from burial surveillance.
To determine possible averted AIDS-related deaths, the investigators compared the estimated with the implied numbers of AIDS deaths in population projections. They estimated HIV prevalence using UNAIDS estimation and projections package (EPP 2007).
Results showed that the ratio of observed to projected deaths in adults peaked in 2001. Between 2001 and 2005, a period when patients paid part of the cost of treatment, the ratio dropped by about 11% for women (from 1.92 to 1.71) and 20% for men (from 1.80 to 1.44). The researchers attributed this average drop in AIDS mortality of about 15% to the effect of treatment, and noted that the drop was larger in men than women, partly because of sex imbalances in access to healthcare financing.
The results further showed that between 2005 and 2007, there was a decline of about 25% for women (from 1.71 to 1.28) and 21% for men (from 1.44 to 1.13). The investigators attributed this drop of over 40% to free treatment, suggesting that treatment cost is an important factor in the decline of AIDS deaths.
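The percentage declines quoted here follow directly from the ratios of observed to projected deaths; a minimal sketch of the arithmetic (the function name is illustrative):

```python
# Relative decline between two observed-to-projected mortality ratios,
# using the figures quoted in the article.

def percent_decline(start, end):
    return 100 * (start - end) / start

# 2001-2005, when patients paid part of the cost of treatment
print(round(percent_decline(1.92, 1.71)))  # women: 11
print(round(percent_decline(1.80, 1.44)))  # men: 20

# 2005-2007, when treatment was free
print(round(percent_decline(1.71, 1.28)))  # women: 25
print(round(percent_decline(1.44, 1.13)))  # men: ~22 (the article rounds to 21)
```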
To confirm whether the trends noted above resulted from reduced AIDS mortality, the investigators turned to the findings from the lay reports. They found a decline from 8467 deaths in 2001 to 4230 in 2007 (about 50%). The study further found that the decline was greater between 2005 and 2007 when treatment was free.
The researchers noted that the observed decline took place during a period when mortality would otherwise have been expected to be very high, since HIV infections in Ethiopia peaked in the late 1990s; the decline therefore demonstrates the impact of treatment. Assuming that burial surveillance covered 85% of deaths, the scientists estimated a reduction of about 56% in AIDS deaths by 2007.
The 56% decline compared favourably with findings from São Paulo in Brazil, which reported a decline in AIDS deaths of about 65% between 1995 and 2002, noted the investigators. They further compared their findings to a New York study which showed a higher decline (63% in two years).
The researchers said that their findings demonstrate the effectiveness of treatment in averting deaths in the early phases of HIV care (the long-term impact is not yet known). For Ethiopia and similar settings, they said that the immediate concern is short-term mortality, because treatment coverage is still in its infancy (only 2% of adult patients in Addis Ababa are on second-line regimens).
The researchers also noted that their findings raise questions about whether and how the decline in mortality can be sustained, and whether improvements in access to antiretrovirals alone can achieve this goal. More proactive attempts to diagnose people earlier and initiate treatment earlier may be necessary in order to reduce death rates further, given the continued high risk of death during the first year of antiretroviral treatment in those who start treatment at lower CD4 cell counts in sub-Saharan Africa.
The investigators acknowledged that their study was based on epidemiological models, and that model-based and observational estimates can differ considerably. The findings might also have been limited by shortcomings of burial surveillance, such as under-reporting.
The investigators noted, however, that burial surveillance could be adapted to similar settings: it is logistically simple to implement because it uses existing structures, and it can be used where no population-based data exist.
Reference
Reniers G et al. Steep decline in population-level AIDS mortality following the introduction of antiretroviral therapy in Addis Ababa, Ethiopia. AIDS, 23:511-518, 2009.