Journal of Global Infectious Diseases
Official Publication of INDUSEM and OPUS 12 Foundation, Inc.
PUBLIC HEALTH RESEARCH  
Year: 2012 | Volume: 4 | Issue: 2 | Page: 120-127
Monitoring data quality in syndromic surveillance: Learnings from a resource limited setting


1 Health Division UNDP, Odisha; Department of Community Medicine, Institute of Medical Sciences and SUM Hospital, Bhubaneswar, Odisha, India
2 Health Division UNDP, Odisha; Division of Epidemiology, School of Public Health, SRM University, Chennai, India
3 Health Division UNDP, Odisha; United Nations Population Fund, Odisha, India
4 Health Division UNDP, Odisha; Health Division, UNICEF, Odisha, India
5 Health Division UNDP, Odisha; Epidemiologist, New Delhi, India


Date of Web Publication: 30-May-2012
 

   Abstract 

Background: India is in the process of integrating all disease surveillance systems with the support of a World Bank funded program, the Integrated Disease Surveillance System. In this context, the objective of the study was to evaluate the components of the Orissa Multi Disease Surveillance System. Materials and Methods: Multistage sampling was carried out, starting with four districts, followed by sequentially sampling two blocks per district; in each block, two sectors; and in each sector, two health sub-centers, all selected on the basis of the best and worst performances. Two study instruments were developed for data validation, assessing the components of surveillance and the diagnostic algorithm. The Organizational Ethics Group reviewed and approved the study. Results: In all, 178 study subjects participated in the survey. The case definition of suspected meningitis was found to be the most difficult (29.94%). Syndromic diagnosis following the diagnostic algorithm was difficult for suspected malaria (28.1%), the 'unusual syndrome' (28.1%), and simple diarrhea (62%). Only 17% could correctly answer questions on follow-up cases, and only 50% could correctly prioritize diseases. Our study showed that 54% cross-checked the data before compilation. Many (22%) faltered on timeliness even during emergencies. The constraints identified were logistics (56%) and telecommunication (41%). The most common reason for participation in surveillance was job responsibility (34.83%). Conclusions: Most of the deficiencies arose from human error in the day-to-day processes of surveillance activities, and hence should be addressed by retraining. Enhanced laboratory support and electronic transmission would improve data quality and timeliness. The validity of some of the case definitions needs to be rechecked. Training programs should focus on motivating the surveillance personnel.

Keywords: Data quality, Evaluation, Infectious disease surveillance

How to cite this article:
Venkatarao E, Patil RR, Prasad D, Anasuya A, Samuel R. Monitoring data quality in syndromic surveillance: Learnings from a resource limited setting. J Global Infect Dis 2012;4:120-7

How to cite this URL:
Venkatarao E, Patil RR, Prasad D, Anasuya A, Samuel R. Monitoring data quality in syndromic surveillance: Learnings from a resource limited setting. J Global Infect Dis [serial online] 2012 [cited 2019 Jul 16];4:120-7. Available from: http://www.jgid.org/text.asp?2012/4/2/120/96778



   Introduction


Syndromic surveillance has been defined as the ongoing systematic collection, analysis, interpretation, and application of real-time (or near-real-time) indicators of diseases and outbreaks that allow for their detection before public health authorities would otherwise note them. [1] It has also been defined as "...surveillance using health-related data that precede diagnosis and signal a sufficient probability of a case or an outbreak to warrant further public health response". [2] The syndromic approach complements the disease-specific approach, with a precise definition for each syndrome, and was pilot-tested in 21 countries. Development and field testing of syndromic reporting initially identified five syndromes of potential public health importance. After the interim review, the World Health Organization (WHO) concluded that syndromic reporting could be useful. The uniqueness of syndromic surveillance lies in its ability to detect outbreaks of diseases that do not fall into the current WHO case classifications, which is particularly important for emerging diseases such as severe acute respiratory syndrome (SARS). [3]

There is a demand for increased surveillance under international regulations, given the increasing risk of international pandemics; hence, it is important to evaluate and implement new surveillance systems to increase the probability of success. [4]

A five-country evaluation of data structures supporting healthcare systems in developing countries, across four continents, identified a number of structural impediments (in timeliness, accuracy, simplicity, flexibility, acceptability, and usefulness) to an effective health information system. [5]

It is important that surveillance systems avoid unnecessary duplication; hence, evaluation of such systems should emphasize improving the quality and efficiency of outbreak detection. [6] Statistical methods for disease surveillance have focused mainly on the performance of outbreak detection algorithms and have not paid sufficient attention to data quality and representativeness, two factors that are especially important in developing countries. [7]

Inadequate data quality may impair our understanding of the true disease epidemiology, compromise the core program functions, and undermine our ability to meet the disease control objectives. [8] The probability of outbreak detection is adversely affected if the data generated from the surveillance system is of inferior quality, therefore, it is extremely important to continuously monitor and evaluate surveillance systems, to ensure a good performance and efficient use of resources. [6]

The Orissa Multi Disease Surveillance System

The Government of Orissa set up the Orissa Multi Disease Surveillance System (OMDSS) in 1999 [Box 1]. The reporting units are the existing government health units. Reporting is carried out weekly on 12 syndromes. [9] Since 2006, the OMDSS has been merged with the Integrated Disease Surveillance Program of the Government of India, a system that draws its origins and learning from successful models such as the OMDSS.



Objective of the study

The objective of our study was to evaluate the components of the OMDSS, such as accuracy of case detection, data recording, data compilation, and data transmission, and to examine the related determinants that have a bearing on data quality.

Study setting

Orissa is one of the least urbanized states in India, with a rate of urbanization of only 14.97% (2001 census). [10] Its health indicators are poor compared to the other states of the country, with poor infrastructure, fewer health staff, and fewer resources. The healthcare system in the state is operational through the primary healthcare approach in all 30 districts. The proportion of tribal population in the state is 22%, [11] the highest in the entire country.


   Materials and Methods


Development of study tools

Two study instruments were developed to evaluate OMDSS, keeping in mind its unique characteristics. The first focused on the components of disease surveillance: case detection, data recording, data compilation, and data transmission. The second tool, a diagnostic algorithm, assessed the ability of the study subjects to identify cases through a syndromic approach.

Data collection

The study was conducted during the period May - June 2005. Four qualified researchers, with past experience in conducting field research in the health sector, were recruited for the field survey. They were extensively trained to be familiar with the existing surveillance system and the use of the tools for the field survey.

Study design and sampling

A sample was selected using the multistage sampling method. First, four districts were purposively selected to represent four diverse geographical regions of the state, chosen for their representative demographic and epidemiological features. In the second stage, two blocks were sampled from each district to obtain a contrast sample of blocks considered to exemplify 'good' and 'poor' reporting, based on a review of the reporting statistics for the year 2004. Similarly, in the third and fourth stages, two sectors were selected in each block and two health sub-centers in each sector, keeping reporting performance as the criterion.


   Results


In all, 178 study subjects participated in the survey. Health workers constituted the largest group (52.8%), followed by medical officers (14.6%) [Table 1].
Table 1: Designation of the study subjects (n = 178)



Case detection

A majority (93.41%) were trained in the principles of disease surveillance. All the study subjects were aware of the weekly reporting pattern under OMDSS and that the reporting week ran from Saturday to Friday. Of them, 93.6% agreed that they followed the standard case definitions; 86.6% of the subjects were able to list more than 10 of the 12 disease categories under surveillance. The case definition of suspected meningitis was reported to be the most difficult (29.94%), followed by the unusual syndrome (26.34%) and neonatal tetanus (8.98%) [Table 2]. Syndromic diagnosis following the diagnostic algorithm [Table 3] was difficult for suspected malaria and the unusual syndrome, with only 28.1% correct diagnoses each, followed by simple diarrhea with 62% correct diagnoses. The survey revealed that 93.5% of the subjects were aware that deaths were captured in the system.
Table 2: Disease categories with case definitions difficult to understand (n = 167)

Table 3: Syndromic diagnosis (n = 167)



Data recording

The participants were asked how they prioritized the recording of a disease when a patient presented with two or more diseases. Only 17% could correctly answer the question on how to record a case as old or new when a patient presented with fever and returned with the investigation reports after two days [Table 4]. Only 50% could correctly decide the priority disease to be recorded when given an instance of a case of measles with diarrhea, but no dehydration.
Table 4: Recording of correct diagnosis following the thumb rules of case detection (n = 178)



Data compilation

Close to 76% were aware of tallying; however, only 53.37% of them could explain the process of tallying. Fifty-four percent agreed that they cross-checked the data before compilation. The number of errors identified by the subjects in the four weeks preceding the survey was analyzed, with a score of one given for each valid error. Among the medical officers, five (19.23%) scored one, seven (26.92%) scored two, two (7.69%) scored three, and only one (3.84%) scored four; the non-response rate was 42%. Among the other staff, however, the score was either one (34.86%) or two (48.02%); only 10 subjects (6.57%) scored three.

Data transmission

A majority (82.58%) of the subjects were aware of all three emergency instances requiring immediate reporting to a higher authority. However, a substantial proportion, close to 17%, were aware of only one of the three instances, and waited till they received reports from all the units before sending them to the higher authority [Table 5].
Table 5: Knowledge of the instances requiring immediate reporting to the higher authorities (n = 178)



Human and logistics factors

The constraints found were non-availability of tally sheets and reporting formats (56%) and lack of communication equipment such as phone/fax/internet (41%) [Table 6]. The reasons given for participating in OMDSS were job responsibility (34.83%), usefulness to the health system (25.28%), and that it helps epidemiologically (0.56%) [Table 6] and [Table 7].
Table 6: Constraints faced by the health personnel in executing DS activities (n = 178)

Table 7: Reason for accepting OMDSS (n = 178)




   Discussion


Disease surveillance systems all over the world use three levels of data types, namely, preclinical data, clinical pre-diagnostic data, and diagnostic data. Preclinical and clinical pre-diagnostic data are generally used by syndromic surveillance, whereas, traditional surveillance mainly relies on diagnostic data. [12]

Case detection

In our study, we had a higher number of paramedical staff as compared to medical officers; as OMDSS is a syndromic surveillance system, an overwhelming share of the workload is borne by the paramedical staff. Our study found that the case definition for meningitis was the most difficult for case detection (29.94%). Other studies have also reported similar problems with the case definition and diagnosis of meningitis affecting its positive predictive value. [13] It was disconcerting to note that only 28% of our study subjects could accurately diagnose suspected malaria and the unusual syndrome. Malaria being highly endemic in Orissa, we recommended repeated training and reinforcement at regular intervals to familiarize staff with the case definitions and syndromic diagnosis for malaria. Experience from the African continent has repeatedly demonstrated that syndromic surveillance is an effective mechanism for early detection of malaria epidemics. A case in point is Ethiopia, where weekly percentile cutoffs proved to be an efficient tool for the detection of malaria outbreaks, removing the need to rely on complicated algorithms. [14]
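The weekly percentile method cited from the Ethiopian experience can be sketched in a few lines. The case counts and the 75th-percentile cutoff below are illustrative assumptions, not figures from the study:

```python
# A minimal sketch of the weekly percentile method: the current week's
# case count is compared with the distribution of counts for the same
# calendar week in past years. All numbers here are hypothetical.

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(pct / 100 * len(ordered)) - 1))
    return ordered[k]

def malaria_alert(current_cases, historical_cases, pct=75):
    """Flag an epidemic alert when the current weekly count
    exceeds the chosen percentile of historical counts."""
    cutoff = percentile(historical_cases, pct)
    return current_cases > cutoff

# Example: counts for the same calendar week over five previous years.
history = [12, 15, 9, 20, 14]
print(malaria_alert(30, history))  # True: 30 exceeds the 75th percentile
```

Because the baseline is drawn from the same calendar week in previous years, seasonality is built into the cutoff without any explicit modeling.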

The standard method for characterizing data quality measures the sensitivity and specificity with which the data can accurately classify patients relative to a criterion determination (gold standard). [15] The importance of regular training of surveillance personnel cannot be overemphasized, as it helps not only in enhancing the sensitivity of the surveillance system, but also in improving case detection and disease reporting indicators. [16]
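As a toy illustration of that gold-standard comparison, the counts in the 2x2 table below are hypothetical:

```python
# Sensitivity and specificity of surveillance classifications against a
# gold-standard determination, computed from a 2x2 confusion table.

def sensitivity_specificity(tp, fp, fn, tn):
    """Return (sensitivity, specificity) from true/false positives/negatives."""
    sensitivity = tp / (tp + fn)   # true cases correctly detected
    specificity = tn / (tn + fp)   # non-cases correctly excluded
    return sensitivity, specificity

# Example: 45 of 50 true cases detected; 90 of 100 non-cases excluded.
sens, spec = sensitivity_specificity(tp=45, fp=10, fn=5, tn=90)
print(round(sens, 2), round(spec, 2))  # 0.9 0.9
```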

Data recording

In OMDSS, training was given on the process of recording a priority disease when a patient came in with multiple diseases, with logical reasoning, following the thumb rules of case detection. Yet only 17% of the respondents could correctly record cases as new or old. Half of the participants could neither prioritize the disease for surveillance recording nor give any logical reasoning when a patient presented with two or more diseases. Accurate, complete, and timely information improves the quality of surveillance data and supports public health decision-making.

Measurements of disease frequency get distorted when serious diagnostic misclassifications take place in the surveillance system and errors in data recording are rampant. [17],[18] One way to improve data quality in a surveillance program is to strengthen laboratory back-up and switch over to electronic reporting mechanisms. [19],[20] Reporting errors can also be brought down considerably by imparting regular refresher training to the personnel involved in surveillance. [21],[22]

Data compilation

Our study shows that 54% cross-checked the data before compilation. Several other studies have pointed out errors in data; deficiencies in the reports gathered by public health systems have been noted across the world. Studies from South Africa have shown that there are serious defects in their death notification systems. [23] Errors have been found in nearly all death notification forms (91%), with a major error, resulting in documentation of an illogical sequence or cause of death, detected in 43% of the instances. [24]

The quality of data is influenced by the clarity of surveillance forms, the quality of training, the supervision of the persons who complete the surveillance forms, and the care exercised in data management. A review of these facets of a surveillance system provides an indirect measure of the quality of data. Examining the percentage of unknown or blank responses to items on the surveillance forms or questionnaires is straightforward; assessing the reliability and validity of responses would require special studies such as chart reviews or re-interviews of respondents. [6] In conducting a public health investigation, the first task is to differentiate natural (statistical) variability and 'pseudo-outbreaks' due to data entry or coding errors from a true increase in an infectious illness. [25] This is one of the reasons why many experts question the capacity of syndromic surveillance to provide early outbreak detection, as there is a lack of precision in the data it generates. [26] Therefore, a potential data source should be judged by the combination of its data quality and timeliness, as well as knowledge of the cost of false alarms versus the cost of delays in triggering true alarms for a specific disease threat. [25]
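The blank-response check is simple enough to automate. In the sketch below the form fields and records are invented for illustration:

```python
# Share of unknown or blank responses per item across a set of
# surveillance forms, represented here as hypothetical dicts.

def blank_rates(forms, items):
    """Percentage of missing/blank answers for each item."""
    rates = {}
    for item in items:
        blanks = sum(1 for f in forms if not f.get(item))
        rates[item] = 100.0 * blanks / len(forms)
    return rates

forms = [
    {"age": 34, "syndrome": "fever", "onset_week": 12},
    {"age": None, "syndrome": "diarrhea", "onset_week": 12},
    {"age": 51, "syndrome": "", "onset_week": None},
]
print(blank_rates(forms, ["age", "syndrome", "onset_week"]))
```

Items with a high blank rate point to unclear form design or inadequate training, the indirect quality indicators discussed above.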

Data transmission

In our study, a substantial proportion, 22% of the subjects, waited till they received reports from all the units before sending them to the higher authority, even when the event needed emergency notification.

New strategies need to evolve, especially those aimed at reporting personnel, as they remain the most important component of disease surveillance systems. [19] Syndromic surveillance does not need to be highly computerized or technical; its tools can be simple, using few technological or human resources, and can complement existing surveillance programs. [3] Researchers have shown that electronic transmission of reports, compared with conventional methods of morbidity reporting, leads to a 2.3-fold increase in case reports; therefore, every attempt should be made to upgrade to automated data transmission. [16] Electronic tools such as the phone and internet have been seen to be efficient modes of data transmission. [27]

Timeliness

In OMDSS, only 69% of the reporting units had sent the weekly disease surveillance report on time. Our study findings were very similar to those of other studies reporting on timeliness. [28] Timeliness has been an area of concern in disease surveillance programs across the world.

The pertinence of timeliness in public health surveillance systems has also been underlined by several authors. The importance of regular evaluation of timeliness has been stressed time and again, as it is a crucial component of the disease surveillance system. [29],[30] Timeliness becomes all the more crucial if data from the surveillance system are to be used in achieving the objectives of public health, namely, disease control and disease prevention.

Studies have shown that timeliness of reporting is the system attribute most amenable to improvement. [13] In an effort to improve the efficiency of disease surveillance systems, many electronic and automated innovations have been tried. [31],[32] Among all the strategies, the simple act of reminding surveillance personnel through phone calls has proved to be a practical way of improving timeliness. [28] Another assured method of improving timeliness is the adoption of electronic transmission of reports, as it results in the speedy arrival of reports. The implementation of electronic platforms improves timeliness and facilitates access to epidemiological data, allowing more rapid analysis and response. [19]

Completeness of data

In the OMDSS, the flow of information has been very efficient, with more than 90% of the 358 reporting units (blocks and hospitals) sending reports every week since April 2002.

Some measure of reporting completeness is necessary to accurately interpret disease incidence or to make national and international comparisons among public health jurisdictions. [16] Routine notifiable disease surveillance often suffers from incomplete reporting; completeness of notifiable infectious disease reporting in the United States has been reported to vary from 9 to 99%. [33] Surveillance systems that have not upgraded to an automated system of disease monitoring and reporting, and still carry on with conventional methods, invariably experience delays in the notification of vital events. [34],[35]

Many researchers have tried to improve the completeness of surveillance systems through the capture-recapture method, also referred to as the underascertainment-corrected method. Whether this corrected method results in a more accurate estimate of reporting completeness depends on how the methods are applied and on the individual characteristics of the data sources being used. [16]
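The simplest two-source form of capture-recapture can be sketched as follows; the source counts are hypothetical, not drawn from the studies cited:

```python
# Two-source capture-recapture (Lincoln-Petersen) estimate of the true
# case count, used to correct for under-ascertainment in reporting.

def capture_recapture(n1, n2, m):
    """Estimate the true case count and the completeness of source 1.

    n1, n2 -- cases found by each source; m -- cases found by both."""
    n_est = n1 * n2 / m                 # estimated total cases
    completeness = 100.0 * n1 / n_est   # % of true cases source 1 caught
    return n_est, completeness

# Example: surveillance found 80 cases, a lab register found 60, 40 in both.
total, pct = capture_recapture(n1=80, n2=60, m=40)
print(total, round(pct, 1))  # 120.0 66.7
```

The estimate assumes the two sources are independent; as the text notes, its accuracy depends on how well the data sources meet that assumption.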

Data analysis and alarm thresholds

The very purpose of a disease surveillance system is early detection of, and response to, epidemics. This can only be achieved if the surveillance program has good data management and analysis systems, enabling identification of threshold levels that serve as alarms. [36],[37] Since the 1990s, syndromic surveillance has proven its utility as a reliable instrument in the early detection of outbreaks. [38] The utility of syndromic surveillance is dependent on the alarm thresholds. The WHO has advocated alerts when weekly cases exceed 75% of the baseline. [3] One method (CPEG), commonly used by military systems, codes the observed data as 0 if they are not outside the historical limits ('normal' situation), + if they are outside the historical limits by more than two standard deviations ('pre-alarm'), and ++ if by more than three standard deviations, compared to the expected data ('alarm'). It is important to be aware that syndromic surveillance is associated with an increased risk of false alarms; [39] hence, we need to be extremely cautious in the interpretation of epidemic alerts, because investigation of false signals places a significant burden on staff resources. Therefore, alarm thresholds should be set based on explicit utility considerations that attempt to optimize the tradeoff between the cost of false alarms and the expected benefits of earlier detection. [25]
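The 0 / + / ++ coding described above can be sketched as follows, taking the historical weekly mean as the expected value; the baseline counts are hypothetical:

```python
# Flag a pre-alarm beyond two standard deviations of the historical
# baseline and an alarm beyond three, following the 0 / + / ++ coding.

from statistics import mean, stdev

def alarm_code(observed, history):
    """Return '0', '+', or '++' for an observed weekly count."""
    mu, sd = mean(history), stdev(history)
    if observed > mu + 3 * sd:
        return "++"   # alarm
    if observed > mu + 2 * sd:
        return "+"    # pre-alarm
    return "0"        # within historical limits

history = [10, 12, 9, 11, 13, 10, 12]   # hypothetical weekly baseline
print(alarm_code(11, history), alarm_code(14, history), alarm_code(20, history))
```

Raising the multiplier trades sensitivity for fewer false alarms, which is exactly the utility tradeoff the preceding paragraph describes.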

Surveillance manpower

Human resources are a vital component of the surveillance system. In our study, many constraints were found to affect the efficiency of surveillance activity, namely, non-availability of tally sheets and reporting formats (56%) and lack of communication equipment such as phone/fax/internet (41%).

A lack of required skills among surveillance personnel was reported in many studies, as was multitasking by health workers who were given the additional role of surveillance activity. Such workers were otherwise not directly involved in primary healthcare activities, and this has been a primary reason for the inefficiency. [28]

In our study, we also looked at the motivation of the study participants to be part of the surveillance activity. The reasons given for participating in the OMDSS were job responsibility (34.83%) and usefulness to the health system (25.28%). Many researchers have highlighted the paramount importance of having highly motivated reporting personnel in surveillance programs, as it has a direct bearing on data quality and timeliness. If health workers are involved in the surveillance program merely as an additional job responsibility, they will invariably display poor commitment. [40] Notification rates can be improved significantly by better coordination between the health personnel involved in clinical and public healthcare activities. [35] Reporting activities can be considerably improved if health workers are regularly visited and motivated. [41]

This analysis supports the recommendations of the WHO, which argues for simplified data collection tools, a minimal common set of key indicators, reduced numbers of registers, and the allocation of dedicated, trained personnel at the local level to maintain patient records and reports. [24]


   Conclusions


The personnel involved in surveillance activities need to be given refresher training and monitored regularly for quality checks. The manual nature of the surveillance activity increases human error. Laboratory support at all levels and automation of data transmission will improve data quality and timeliness. Some of the case definitions, especially for malaria, need to be reviewed. The motivation of surveillance personnel needs to be boosted.


   Acknowledgment


RS conceptualized the study. DTM and AA helped in executing the study. EVR coordinated the study. RRP prepared the manuscript. All authors were involved in various stages of data collection and analysis. All authors contributed toward finalizing the manuscript.

 
   References

1. Sosin DM. Syndromic surveillance: The case for skillful investment. Biosecur Bioterror 2003;1:247-53.
2. Centers for Disease Control and Prevention. Available from: http://www.cdc.gov/biosense/publichealth.htm. [Last accessed on 2011 Sep 04].
3. May L, Chretien JP, Pavlin JA. Beyond traditional surveillance: Applying syndromic surveillance to developing settings - opportunities and challenges. BMC Public Health 2009;9:242.
4. World Health Organization. The World Health Report 2007 - A Safer Future: Global Public Health Security in the 21st Century. Geneva, Switzerland; 2007.
5. Mate KS, Bennett B, Mphatswe W, Barker P, Rollins N. Challenges for routine health system data management in a large public program to prevent mother-to-child HIV transmission in South Africa. PLoS One 2009;4:e5483.
6. Klaucke DN, Buehler JW; Centers for Disease Control (CDC). Guidelines for evaluating surveillance systems. MMWR Morb Mortal Wkly Rep 1988;37 Suppl 5:1-18.
7. Lescano AG, Larasati RP, Sedyaningsih ER, Bounlu K, Araujo-Castillo RV, Munayco-Escate CV, et al. Statistical analyses in disease surveillance systems. BMC Proc 2008;2 Suppl 3:S7.
8. Sprinson JE, Lawton ES, Porco TC, Flood JM, Westenhouse JL. Assessing the validity of tuberculosis surveillance data in California. BMC Public Health 2006;6:217.
9. Orissa Multi Disease Surveillance System. Health and Family Welfare Department, Government of Orissa. Available from: http://www.orissa.gov.in/health/diseasesurveillance.htm. [Last accessed on 2010 Nov 28].
10. Department of Health and Family Welfare, Government of Orissa. Available from: http://www.orissa.gov.in/health_portal/healthprofile/profile.html. [Last accessed on 2010 Nov 29].
11. Proportion of tribal population in Orissa. Available from: http://www.orissa-tourism.com/tribesdata.htm. [Last accessed on 2010 Nov 29].
12. Berger M, Shiau R, Weintraub JM. Review of syndromic surveillance: Implications for waterborne disease detection. J Epidemiol Community Health 2006;60:543-50.
13. Gagnon ER, Christine A, Talbot EA. Timeliness is of the essence: A systematic evaluation of disease reporting in New Hampshire. The 134th Annual Meeting and Exposition, November 4-8. APHA; 2006.
14. Teklehaimanot HD, Schwartz J, Teklehaimanot A, Lipsitch M. Alert threshold algorithms and malaria epidemic detection. Emerg Infect Dis 2004;10:1220-6.
15. Mandl KD, Overhage JM, Wagner MM, Lober WB, Sebastiani P, Mostashari F, et al. Implementing syndromic surveillance: A practical guide informed by the early experience. J Am Med Inform Assoc 2004;11:141-50.
16. Doyle TJ, Glynn MK, Groseclose SL. Completeness of notifiable infectious disease reporting in the United States: An analytical literature review. Am J Epidemiol 2002;155:866-74.
17. Brenner H. Effects of misdiagnoses on disease monitoring with capture-recapture methods. J Clin Epidemiol 1996;49:1303-7.
18. Washko RM, Frieden TR. Tuberculosis surveillance using death certificate data, New York City, 1992. Public Health Rep 1996;111:251-5.
19. Huaman MA, Araujo-Castillo RV, Soto G, Neyra JM, Quispe JA, Fernandez MF, et al. Impact of two interventions on timeliness and data quality of an electronic disease surveillance system in a resource limited setting (Peru): A prospective evaluation. BMC Med Inform Decis Mak 2009;9:16.
20. Effler P, Ching-Lee M, Bogard A, Ieong MC, Nekomoto T, Jernigan D. Statewide system of electronic notifiable disease reporting from clinical laboratories: Comparing automated reporting with conventional methods. JAMA 1999;282:1845-50.
21. Abdool Karim SS, Dilraj A. Reasons for under-reporting of notifiable conditions. S Afr Med J 1996;86:834-6.
22. Prato R, Napoli C, Barbuti G, Germinario C, Lopalco PL. General practitioners and mandatory surveillance of communicable diseases: A descriptive study in Puglia (South Italy). Ann Ig 2004;16:449-55.
23. Burger EH, Van der Merwe L, Volmink J. Errors in the completion of the death notification form. S Afr Med J 2007;97:1077-81.
24. Mate KS, Bennett B, Mphatswe W, Barker P, Rollins N. Challenges for routine health system data management in a large public program to prevent mother-to-child HIV transmission in South Africa. PLoS One 2009;4:e5483.
25. Mandl KD, Overhage JM, Wagner MM, Lober WB, Sebastiani P, Mostashari F, et al. Implementing syndromic surveillance: A practical guide informed by the early experience. J Am Med Inform Assoc 2004;11:141-50.
26. Cooper DL, Verlander NQ, Smith GE, Charlett A, Gerard E, Willocks L, et al. Can syndromic surveillance data detect local outbreaks of communicable disease? A model using a historical cryptosporidiosis outbreak. Epidemiol Infect 2006;134:13-20.
27. Chretien JP, Burkom HS, Sedyaningsih ER, Larasati RP, Lescano AG, Mundaca CC, et al. Syndromic surveillance: Adapting innovations to developing settings. PLoS Med 2008;5:e72.
28. Sathyanarayana. An evaluation of the Integrated Disease Surveillance Project, Bellary unit, Karnataka State, India. Available from: http://www.technet21.org/Tools_and_resources/pdf_file/KarnatakaEvalSurv.pdf. [Last accessed on 2010 Nov 30].
29. German RR, Lee LM, Horan JM, Milstein RL, Pertowski CA, Waller MN; Guidelines Working Group, Centers for Disease Control and Prevention (CDC). Updated guidelines for evaluating public health surveillance systems: Recommendations from the Guidelines Working Group. MMWR Recomm Rep 2001;50(RR-13):1-35.
30. Centers for Disease Control (CDC). Guidelines for evaluating surveillance systems. MMWR Morb Mortal Wkly Rep 1988;37 Suppl 5:1-18.
31. Birkhead G, Chorba TL, Root S, Klaucke DN, Gibbs NJ. Timeliness of national reporting of communicable diseases: The experience of the National Electronic Telecommunications System for Surveillance. Am J Public Health 1991;81:1313-5.
32. Silk BJ, Berkelman RL. A review of strategies for enhancing the completeness of notifiable disease reporting. J Public Health Manag Pract 2005;11:191-200.
33. Jajosky RA, Groseclose SL. Evaluation of reporting timeliness of public health surveillance systems for infectious diseases. BMC Public Health 2004;4:29.
34. Colizza V, Barrat A, Barthelemy M, Vespignani A. The role of the airline transportation network in the prediction and predictability of global epidemics. Proc Natl Acad Sci U S A 2006;103:2015-20.
35. Ferguson NM, Cummings DA, Cauchemez S, Fraser C, Riley S, Meeyai A, et al. Strategies for containing an emerging influenza pandemic in Southeast Asia. Nature 2005;437:209-14.
36. Dailey L, Watkins RE, Plant AJ. Timeliness of data sources used for influenza surveillance. J Am Med Inform Assoc 2007;14:626-31.
37. Jackson M, Baer A, Painter I, Duchin J. A simulation study comparing aberration detection algorithms for syndromic surveillance. BMC Med Inform Decis Mak 2007;7:6.
38. Najera J, Kouznetzsov R, Delacollette C. Malaria Epidemics: Detection and Control, Forecasting and Prevention. Geneva: World Health Organization; 1998. Available from: http://www.apps.who.int/malaria/docs/Leysinreport.pdf.
39. Meynard JB, Chaudet H, Texier G, Ardillon V, Ravachol F, Deparis X, et al. Value of syndromic surveillance within the armed forces for early warning during a dengue fever outbreak in French Guiana in 2006. BMC Med Inform Decis Mak 2008;8:29.
40. Buehler JW, Hopkins RS, Overhage JM, Sosin DM, Tong V; CDC Working Group. Framework for evaluating public health surveillance systems for early detection of outbreaks: Recommendations from the CDC Working Group. MMWR Recomm Rep 2004;53(RR-5):1-11.
41.Thacker SB, Choi K, Brachman PS. The surveillance of infectious diseases. JAMA 1983;249:1181-5.  Back to cited text no. 41
    

Correspondence Address:
Rajan R Patil
Health Division UNDP, Odisha; Division of Epidemiology, School of Public Health, SRM University, Chennai
India

Source of Support: This study was carried out by the UNDP Odisha Office and supported by the WHO as part of a series of action research studies to aid development of a national multidisease surveillance program, the Integrated Disease Surveillance Program (IDSP). Conflict of Interest: None.


DOI: 10.4103/0974-777X.96778




Tables: [Table 1], [Table 2], [Table 3], [Table 4], [Table 5], [Table 6], [Table 7]

This article has been cited by:
1. Hong Chen, David Hailey, Ning Wang, Ping Yu. A review of data quality assessment methods for public health information systems. International Journal of Environmental Research and Public Health 2014;11(5):5170.
2. Alexander Rosewell, Berry Ropa, Heather Randall, Rosheila Dagina, Samuel Hurim, Sibauk Bieb, Siddhartha Datta, Sundar Ramamurthy, Glen Mola, Anthony B. Zwi, Pradeep Ray, C. Raina MacIntyre. Mobile phone-based syndromic surveillance system, Papua New Guinea. Emerging Infectious Diseases 2013;19(11):1811.




© 2008 Journal of Global Infectious Diseases | Published by Wolters Kluwer - Medknow