

Measure for measure

Monday, 27 October 2014 Written by // Bob Leahy - Editor Categories // Health, International, Living with HIV, Opinion Pieces, Bob Leahy

How do we define success in both fighting HIV and in supporting people who live with it? How do we measure success and can those tools we have at our disposal provide the kind of accountability we should expect? Bob Leahy reports


Question period

If words like accountability, surveillance, epidemiology, evaluation and the like don’t sound too exciting, you might be forgiven for thinking that they need be of no concern to you. But think of them as ways in which we measure success in the HIV world. So much of that relates to those of us living with HIV that perhaps we should be interested in how they work and whether they work well. So today we are measuring how we are measured.

The question is: what exactly is the measure of success in the AIDS world?

Is it about an AIDS Service Organization (ASO) that ranks highly in the eyes of its clients, one able to secure funding (or fundraising dollars) for programs and services that enable it to make a difference in the lives of the community members it serves?

Is it about an organization or province or country that is making good progress in bringing the epidemic to its knees by stemming new infections and caring for those infected?

Is it about policies which encourage high rates of HIV testing?

Is it about an individual’s ability to access health care and treatment that will bring his or her viral load to the undetectable level?

Is it about a health care system that has good rates of engagement and retention of HIV-positive patients?

Is it about our organizational and collective accountability, what targets we set for each other and whether we attain them?

The answer is that it’s likely all these and more. We are, after all, a community whose service providers have become increasingly professionalized, hierarchical and having more in common with corporate structures than ever before (more on that in another post). Given this shift, how professional is our approach to measuring success? Do we have adequate measuring tools that evaluate the things that matter? Do we also have a commitment to the kind of quality control mechanisms we see in other institutions? Just how accountable are we in fact?

It’s something we seldom hear about except, say, in the context of hearing our professional colleagues groaning at the sometimes onerous reporting mechanisms that currently exist in ASOs. It’s a common theme to hear, for instance, that staff are overwhelmed with paperwork, much of it reports. Thus our professional community includes growing numbers not engaged in HIV prevention or support, but in proposal writing, number crunching and filing endless reports. This is the face of accountability in 2014.

Few would argue we don’t need accountability, though; there are big dollars involved and, if nothing else, lives at stake. In fact arguably we need more, but at what price?

Laurie Edmiston, the well-respected Executive Director of CATIE, is articulate when talking about Canada’s response to HIV. Of the Federal Initiative to Address HIV/AIDS in Canada she says “it was released over 10 years ago by the previous government, and its vague, motherhood areas of action refer to activities to be introduced between 2004-2005 and 2008-2009. No targets. No efforts underway to update it. Pretty mushy accountability.” 

She’s right of course. But does this “mushy accountability” exist elsewhere? Let’s look at some areas where we in Canada measure success – or not.

Surveillance – or “here’s looking at you, kid.”

“Surveillance” sounds ominous, I know, but we’re talking about the necessary art of gathering data on new HIV infections (incidence), where and to whom they are occurring, and on the number of people now infected with the virus (prevalence), also broken down by various demographics.

While the surveillance numbers in reports such as the annual HIV/AIDS Epi Updates are unavoidably part estimates (think about it and you’ll know why), they are invaluable in providing a broad measure of the success (or not) of our prevention efforts.

They can also dispel myths. Heard that new infections are on the rise in Canada again? They are not. Go to the latest report we have, from 2011, and you’ll confirm that the number of new HIV infections in 2011 was approximately 3,175, about the same as or slightly lower than the estimate for 2008. Heard that MSM (men who have sex with men) new infection rates are on the rise too? Well, in Canada, they are not. The number of new HIV infections attributed to the MSM exposure category reached a peak from 1984 to 1986, decreased until 1999, then increased until 2005; it has remained stable since.

Want to know how many of us in Canada are currently living with HIV? It’s about 73,000.

Now I’m no epidemiologist (heck, I can hardly spell it) but it is semi-reassuring that our country is accountable enough to be able to produce reports like this, whose usefulness is obvious. Trouble is, the latest data available are from 2011! Surely, thirty-plus years into the epidemic, we can do better than measuring the effectiveness of our response three years ago?

If you poke around on the internet you will find data, probably more recent than this, which pertains to your province, or even, if you live in a large metropolitan area like Toronto, your city. That is perhaps more relevant to you, and to local service providers, than national data.

One area where we fall down almost across the board, though, is that in the age of the treatment cascade we are not capturing enough data to truly measure how we are doing. More on that below.

The power of one

I don’t want to ignore the role of people living with HIV in monitoring and evaluating the state of their  own health and well-being. To do otherwise situates us as numbers, as mice to be studied, and I don’t like that.

So . . . if once tested, I seek access to care, go on treatment, adhere to it and keep my CD4 count high and my viral load down, preferably to an undetectable level, that is certainly one measure of success of my own efforts and those of my health care team. True, those numbers are not a complete measure of my mental and physical wellbeing and they certainly don’t always reflect the quality of my life or barriers still to be crossed, but they are nevertheless important measures of my body’s response to the virus that has invaded it. In that respect those numbers partially do the job. (Partial success, in fact, is a theme that crops up often when we look at how, collectively, we are doing in the fight against HIV.)

Individually we are accountable to ourselves. Given the large numbers of people with undetectable viral loads these days, I’d argue that people living with HIV are in many cases doing a pretty fine job of maintaining good health. Increased testing rates in my home province are another measure that individual responsibility is alive and working.

The ASO army

Our AIDS Service Organizations (ASOs) are no strangers to reporting and being evaluated, due in no small part to the requirements of their funders. This can result in multiple different filings: to both the provincial and federal governments in the case of Canada, and to private funders if they too need to be satisfied about where their money has been used.

An example of the funding/reporting mechanism in my own province is OCHART, which seems unpopular with the front-line and administrative professionals who use it but doubtless serves the requirements of government funders (who provided $24.7 million in the case of the province of Ontario, the federal government contributing an additional $4.2 million in 2012/13). Those reporting mechanisms produce useful data too, such as that contained in Ontario’s annual “View from the Front Lines”. The 2014 report is here and is a veritable mine of information, including on the epidemiology of the province.

Trouble is, the measurement tools used by funders tend to be quantitative rather than qualitative in nature. In other words, their stock in trade is numbers – contacts made, needles handed out, clients served, etc. – rather than the quality of those interactions. The opportunities for qualitative evaluation of ASOs are in fact very limited. That reflects a bias towards “upward” reporting to funders as opposed to “downward” reporting to members and clients.

What about accountability to clients? Again it’s mushy, with only the most basic reporting generally in evidence. Efforts to develop “bottom-up” evaluation tools, where clients themselves evaluate their agency, have not progressed well. An example is the “Denver Principles Empowerment Index”, a promising endeavour worked on by POZ founder and activist Sean Strub, which seems to have foundered. You can read about his proposal here.

All in all, it’s hard to deny that accountability to clients is largely missing from ASO program and service evaluation. That’s an advocacy cause crying out to be taken up. ASOs need to be accountable to their funders, of course, but it shouldn’t end there. Without appropriate measurement indexes, people living with HIV have few opportunities to hold accountable the organizations that serve them. That needs to change.

Bring on the treatment cascade

This is where we really start to see the gaps. The treatment cascade (sometimes called the engagement cascade by those who are uncomfortable with an emphasis on treatment of people living with HIV) has become an essential evaluation tool to monitor and react to our collective response to HIV. In simple terms it indicates the percentages of those with HIV who have in fact tested positive, those in care, those taking treatment and those who are virally suppressed.

So important is the treatment cascade concept that global strategies to bring the epidemic to its knees have been built around it. For example, UNAIDS has released global cascade targets that have come to be known as 90-90-90. The goal? Six years from now, 90% of all people with HIV will know their status, 90% of those will be on treatment and 90% of those will be virally suppressed.
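It’s worth noticing that those three 90s compound, because each one applies only to the people who made it through the previous step. A quick back-of-the-envelope sketch (the percentages here are the UNAIDS targets themselves, not measured data for any country) shows what meeting all three targets would actually mean:

```python
# The 90-90-90 targets compound: each 90% applies to the previous step,
# so the share of ALL people with HIV who end up virally suppressed
# is considerably smaller than 90%.
diagnosed = 0.90                   # know their status
on_treatment = 0.90 * diagnosed    # of those diagnosed, on treatment
suppressed = 0.90 * on_treatment   # of those on treatment, suppressed

print(f"On treatment: {on_treatment:.1%} of all people with HIV")
print(f"Virally suppressed: {suppressed:.1%} of all people with HIV")
```

In other words, even a jurisdiction that hits all three targets would have roughly 81% of people with HIV on treatment and roughly 73% virally suppressed, which is why the drop-off at every step of the cascade matters so much.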

Where does Canada stand? Sadly, we have no targets.

Where does my home province Ontario stand? Sadly, it has no targets.

Does Canada have the ability to populate our country’s own treatment cascade with data, like knowing how many of us have undetectable viral loads? Sadly, we don’t.

We are lagging. CATIE’s Laurie Edmiston says of the bits and pieces we can put together: “What it tells us is that we are behind other countries with similar publicly-funded health systems. Surprisingly, engagement in our cascades appears to be only slightly better than sub-Saharan Africa despite our comparatively well-funded health system. In fact, once people are diagnosed, sub-Saharan Africa does better than us in terms of getting them on treatment and suppressing viral load.”

She adds “Targets are important. Targets promote accountability.”

In fairness, the data-gathering requirement in Canada is being worked on as we speak. Ontario is about to release fairly comprehensive data from the Ontario HIV Treatment Network’s (OHTN) Ontario Cohort Study (OCS) database. So at least we will know more about the treatment cascade profile of those enrolled/engaged in care. And there are currently pan-Canadian consultations in their early stages, led by the Public Health Agency of Canada (PHAC), to develop common reporting standards across all provinces, which is a major step towards country-wide reporting of cascade-related data.

In the meantime we look longingly at jurisdictions which have got their act together faster and where accountability is less “mushy” than in our own country. In fact treatment cascade data is starting to trickle in from all corners of the world.

In the US, for instance, we have just seen data for those in the Ryan White program, where we learn that some 82% were retained in care and 73% achieved virological suppression. A second report from the U.S., focussing on MSM who have been diagnosed, revealed less impressive numbers: 51% were retained in care and 42% achieved virological suppression.

Then just last week, from The Global Forum on MSM and HIV, we heard more treatment cascade data compiled on a global basis. This indicated a significant drop-off among MSM at every point of the HIV care continuum, especially among young MSM and MSM in lower-income countries. All in all, only 50% of MSM living with HIV surveyed were virologically suppressed.

The point of quoting this data is to show its obvious value to prevention and treatment advocates, strategists and policy-makers. Without this kind of reporting, we can only guess where the problems lie, and thus our efforts to address them are inevitably less effective.

The big picture

It seems apparent that where accountability is demanded by the governments (and politics) of the day, it happens. Elsewhere there are a lot of gaps, and where we’ve allowed that to happen it can come back to bite us quite badly. Witness our failure, compounded by the way responsibility for health care is distributed provincially, to compile accurate nation-wide treatment cascade data. And witness too the failure of many (most?) AIDS Service Organizations to be accountable to the people and communities they serve. We owe it to ourselves to ask for more.

We need to develop an interest in the qualitative as much as we do the quantitative. People living with HIV can and should play a vital role in that process.

It’s important too to acknowledge that monitoring, evaluation and accountability are not the be-all and end-all of AIDS work. Too much of it and it distracts from the real work that needs to be done. Too much of it and we end up with bureaucracies. So our response to HIV should not be evaluated on reporting mechanisms alone. Ultimately they just produce numbers; it is the people behind those numbers that are important.

Finally, while Canada is lagging behind many countries on many measures of performance (the reasons for this are complex and can’t be covered adequately here), the picture is not all bad. Refer to my article from earlier this year, “Is Canada Winning the Fight Against HIV?”, and you’ll find a list of areas in which we excel as well as those we need to do better on. In that report I gave Canada a C- on its overall response to HIV; some people living with HIV suggested I had been generous, but I stand by that rating.

What do you think? Do we have a handle on the epidemic? Let us know.