In recent months, many allegations have been made about fudging of official data by government departments in both the centre and the states. How do the allegations hurt the credibility of institutions and what safeguards can be put in place? We ask experts.
The art of book-cooking and “smart” accounting hid the true state of Punjab’s finances — Manpreet Badal, Punjab Finance Minister
On assuming office on March 16, 2017, the Congress government was shocked to learn that the financial position of the state was far worse than we had imagined. The treasury was virtually shut under the weight of bills amounting to Rs 13,000 crore pending clearance, farmers were set to sell their Rabi crop, and the government was still awaiting sanction of the Cash Credit Limit (CCL). And then came the RBI shocker – a letter on March 29 suspending all payments to the state.
Normally, budget documents and Comptroller and Auditor General (CAG) accounts ought to reflect the true state of finances. But the previous government had deployed “smart” accounting practices and the art of cooking books: fast-forwarding receipts, deferring expenditure, failing to provision for contingencies, and incurring liabilities outside the budget.
Such practices make the integrity of the budget presented by the government deeply suspect and blunt government accountability.
Most of the outstanding liabilities relate to benefits for the Scheduled Castes and other disadvantaged sections of society, project funds from the central government, and awards of the Central and State Finance Commissions in favour of Urban Local Bodies and Panchayati Raj Institutions. These liabilities are also likely to block the future development of the state.
Another very disturbing feature was the abuse of entities such as the Punjab Infrastructure Development Board, the Punjab Urban Planning and Development Board and the Rural Development Board to raise loans by mortgaging their future revenues or hypothecating the immovable properties at their disposal. This is akin to selling the family silver to run the kitchen. The raising of these loans was an off-budget exercise that escaped due scrutiny, legislative approval and CAG audit.
India’s system of official statistics leaves room to ignore politically uncomfortable data — Abhijit Sen, former member, Planning Commission of India
New political formations want to show positive results quickly as well as claim that problems are a legacy of past regimes. Allegations of data fudging have been more frequent in the last three years than earlier, perhaps because of a change in government.
India’s system of official statistics is robust enough to prevent gross tampering. But the system remains patchy and the data is unable to capture some changes. There is thus scope to ignore the weak spots the data throws up when they are politically uncomfortable.
Our data on the informal sector is not only weak; there have been no results from the National Sample Survey after 2014. Consequently, the impact of demonetisation had to be gleaned from data on the formal sector and agricultural production. There are discrepancies even in the formal sector data – e.g. our index of industrial production throws up much lower rates of growth than the present method of deflating production values from company balance sheets by the wholesale price index. However, this is not fudging but a failure to make the existing system more robust.
Recent allegations of false data include those around the Beti Bachao scheme in Haryana and the public finance data that the new Punjab Finance Minister inherited. Neither is novel. Haryana’s effort to show a massive improvement in child sex ratios within a short period is similar to the “achievements” of Nirmal Grams in the past and of Open Defecation Free states more recently. Such claims are based on internal Management Information Systems (MIS), which measure efforts and outputs rather than outcomes; outcomes can only be meaningfully assessed after the next Population Census in 2021. However, rushing out good news from MIS is natural politics, as is the opposition playing up doubts expressed by researchers.
One important reason for poor quality GDP growth data is that the government does not need it — Neelkanth Mishra, India Equity Strategist, Credit Suisse
Human success, dependent on the ability to cooperate in ever larger numbers, needs an efficient, trustworthy and accurate signalling mechanism. In large groups, this is data. In its absence, societies can become dysfunctional. Take, for example, the absence of a learning curve in our policy-making. We cannot even agree on inflation data, let alone on hard-to-measure growth indicators or on the design of schemes like gold monetisation bonds, which have been running unsuccessfully for decades.
The best-quality data comes only if it is important to the aggregator and is collected as a byproduct of activity. This holds true even in large multinational corporations. Invoices add up to revenues and are therefore a good source; ask someone to key in sales numbers, and you are taking a risk.
Official statistics fail both these tests. One important reason for poor quality GDP growth data is that the government does not need it. Quarterly GDP is calculated only because the IMF wants it.
Field officers can cook up numbers out of either sloth or greed: inflation spiked in April 2007 because someone woke up after six months and updated steel prices. Aerial surveys have shown large insurance claims for cotton in areas that had not sown cotton for generations. Crop yields are under-reported to help farmers claim insurance. The Central Statistics Office (CSO), as the aggregator, is often blamed, but it is only the messenger: the source of the data is the problem.
The CSO is enthusiastic about the deluge of granular, high-frequency, accurate data expected from GST. But the transition may create more disruption because a large part of GDP in the past has been projected from service tax/sales tax collections: these will no longer be available.
Requests for the raw data go unanswered — Mihir Sharma, Senior Fellow & Head of Economy and Growth Programme at Observer Research Foundation
Most government departments could tamper with data if they tried, but that does not mean they will, or that they have been doing so. Most often, the data they release will be accurate. But official data can lose credibility even when it is not inaccurate. Credibility requires that data be comparable over time, gathered in an inclusive manner, processed transparently, and presented clearly. Many government departments fall short at one or more of these stages, leading to a loss of credibility in their data.
One way to tell whether this is happening is to check whether the data on a department’s website is regularly updated, and to cross-check it against, first, press releases from the department and, second, answers to questions in Parliament. The last should be considered the most credible form of data, as bureaucrats know the consequences of misleading Parliament.
What frequently happens, however, is that departments hang on to the raw data they receive – on scheme outcomes, for example. They process that data in some way, slicing and dicing it till they find the presentation that looks most politically palatable. That then makes the headlines in the newspapers. Requests for the raw data underlying the press release can go unanswered.
To make data more credible, it is necessary to ensure that all such data is open to RTI queries. I would also suggest that an authority like the CAG be instituted that has the right, on behalf of Parliament, to make data-driven demands of departments and ministries on an ongoing basis.
Most data on government programs focus on inputs rather than outcomes — Devesh Kapur, Director, Center for the Advanced Study of India, University of Pennsylvania
“Statistics make officials, and officials make statistics.” – Chinese proverb
Concerns over fudging of official data, while valid, are overblown relative to the less sensational but deeper challenges facing the collection of official data in India. Official data in India suffers from issues related to validity, coverage, and accuracy. Validity refers to the relationship between concept and collected information i.e. the connection between what is actually measured and what is purported to have been measured; coverage refers to the completeness of data i.e. the presence or absence of the data needed for a given policy question; and accuracy refers to the correctness or avoidance of errors in the data.
Most data on government programs focus on inputs rather than outcomes. There is very good data on the construction of toilets under the Swachh Bharat Yojana, but very little on actual usage. And coverage is undermined by inordinate delays in releasing data, by which time it is no longer very useful for policy. The Census and the National Family Health Surveys are good examples of this.
Increasingly, however, the limitations of official data are due to weak human capital. Statisticians in the government are recruited through the UPSC. Despite the large number of applicants taking the exam, the Indian Statistical Service has among the lowest application-to-post ratios in the all-India services. Furthermore, in many years it is the only service where the recommendation-to-post ratio has been less than one, implying that qualified candidates are unavailable. If a country of a billion-plus people cannot find a few dozen qualified statisticians annually to staff its statistical bureaucracy, what does that say about its ability to generate good data?
India’s GDP estimates mystery: Circumstances, not malice — Praveen Chakravarty, Senior Fellow, IDFC Institute, Mumbai
When the Central Statistics Office (CSO) announced that India’s GDP grew at 7 per cent in the quarter during which demonetisation was implemented, it was received with bewilderment. When the CSO estimated India’s GDP growth at 8 per cent for FY16, experts and analysts lamented that it does not “feel like” an 8 per cent growth economy. All other indicators pointed otherwise – poor trade numbers, abysmal credit growth, lack of private investment and so on. The CSO attracted global criticism for putting out numbers that did not match “lived realities”. India was tainted with data fudging allegations. This is a tad unfair.
The CSO adopted a new GDP estimation methodology in 2015. It had three big changes:
1. A change of base year, which is a routine and accepted revision
2. An expansion of the dataset to incorporate new measures, needed to ensure GDP adequately captures the changing nature of India’s economy
3. A change in methodology to estimate GDP through a superior “value added” measure rather than just the production of goods and services
These three changes have made it impossible to reconcile current GDP estimates with past trends. The CSO has been unable to tell us what the previous years’ GDP estimates would have been under the new methodology, because it is difficult to go back in time and collect new data points. There is also scepticism about the CSO’s numbers because the new dataset has not been made public.
But insinuations of connivance or deliberate fudging of data smack of ignorance of the institutional strength of the CSO and the technocratic professionalism of the Chief Statistician of India. As they say: ‘Do not attribute to malice what can be explained by circumstances.’