The Slippery Slope of Leaving the Vulnerable Behind
The SDGs: Leave No One Behind!! Laudable. Critical. And already failing at the first hiccup.
The recent SDG Index and Dashboard effectively aimed to establish a baseline for SDG indicators in all countries. A lot of data is needed (because there are A LOT of indicators - but that's another subject). There is significant pressure on national (and subnational) governments to deliver the data to prove they are committed to change and sustainable development.
The problem is that in quite a few governments, particularly in small countries, least developed countries and fragile states, capacity for data collection and storage is weak. There are big data gaps, and disaggregated data is the exception rather than the rule. That's at the national level. At the subnational level, the gaps are bigger, and data is weaker because data collection methodologies are not reliable. But that doesn't mean there isn't data - it's just not perfect and not complete.
And yet, because the data is not perfect or complete (or disaggregated), the authors of the SDG Index opted to leave a number of countries out of their analysis. The most disappointing element of this decision was that the entire Pacific region was excluded. How do you exclude an entire region? In particular, a region that is wholly vulnerable to climate change and everything associated with it that impedes development (food insecurity, lack of ICT, poor health care and education services due to the extraordinarily high cost of providing them). By not including them in the very first Index, you've already left them behind.
The authors of the report, including Professor Jeffrey Sachs, claim that they couldn't do the analysis because government data was too poor and too much was missing. I ask this: for the whole region? Capacities in countries such as Samoa, the Cook Islands and Fiji aren't wholly lacking - surely some data on at least a few subjects was available? Gaps we can understand, but government decisions do get made (across the Pacific), and they are made based on data, so surely some analysis could have been undertaken? (Also, a question worth raising: you designed 230 indicators for 17 SDGs, covering things many have not considered before. What did you think would happen?)
Secondly, if we're going to be devoted to the idea that no one should be left behind, shouldn't we be committed to doing everything possible to ensure success? Or is that a qualified commitment - you won't get left behind if your government is good enough at doing its job? Get real - all UN agencies, the World Bank, the Asian Development Bank and the major donors in the region, such as Australia, New Zealand and the European Union, have data - good data. Maybe not on every SDG indicator for every country, but it's there. Why doesn't this data count? Because it's not government data? Why are the authors being so rigid about what data 'counts'? Is that even fair to the countries that will benefit the most from the SDGs? My goodness…
To give an example of data that does exist, UN agencies were able to track MDG progress in each Pacific country, including small countries such as Tonga, while UNDP's Human Development Index has data for all Pacific countries save Tuvalu, Nauru and the Marshall Islands. It's a place to start, and if it's not reliable, then we have bigger problems.
There are arguments on both sides of this issue. On the one hand, some academics and policy makers argue that in order to know whether or not someone is being left behind, we need good data, and it needs to be disaggregated (by sex, age, region, ethnicity, education level, etc.). Without disaggregated baseline data, they argue, we are only guessing at the situation and couldn't possibly measure the success (or failure) of policies and programmes.
On the other hand, some academics and practitioners argue that there is a large difference between poor data and wholly useless data. In the Pacific, Professor Sachs is likely correct in arguing that data is poor, but is it useless? No. As noted in the link above, data always tells us something, even if it is not enough. It can shed light on problems and provide a guide on how to fix them - although we'll have to fill in the gaps along the way. Perhaps this is where that really useful development partner and donor data could come into play? Hmmmm….
This brings up the issue of why, after so many years of investment in governance, government data on development issues is still so poor in many countries. First, too often the development community works outside of government to collect data and provide estimates. But, as argued here, estimates will never create functioning, sophisticated government data systems. Second, in particular in relation to the SDGs, it is intergovernmental bodies that are responsible for designing most of the data standards we adhere to, and those standards focus primarily on serving international institutional needs over national government needs, particularly in developing countries. To sum up: the development community has, for the most part (until recently), failed to use government systems to improve both data collection capacity and the data itself; and the data the development community needs is for itself, and doesn't necessarily add value in any particular country. Super.
Rather than owning up to why developing countries have poor data, the authors of the SDG Index shamed them by excluding them from the Index entirely. Instead of doing everything possible to ensure some of the most vulnerable countries were counted - in the spirit of the SDGs and all - by utilizing reliable supplementary data, the authors of the SDG Index chose to take the easy way out and throw an entire region under the bus (or the waves, as it were) for not being good enough. You want people to commit to the SDGs, to make sacrifices and do whatever is possible to succeed? Don't exclude them from the get-go.
We're eight months into the SDG era, in which we promised to 'leave no one behind'. We already have.