
Evaluating Research – Process, Examples and Methods

Evaluating Research

Definition:

Evaluating Research refers to the process of assessing the quality, credibility, and relevance of a research study or project. It involves examining the study's methods, data, and results to determine their validity, reliability, and usefulness. Both experts and non-experts can evaluate research; doing so requires critical thinking, analysis, and interpretation of the research findings.

Research Evaluation Process

The process of evaluating research typically involves the following steps:

Identify the Research Question

The first step in evaluating research is to identify the research question or problem that the study is addressing. This will help you to determine whether the study is relevant to your needs.

Assess the Study Design

The study design refers to the methodology used to conduct the research. You should assess whether the study design is appropriate for the research question and whether it is likely to produce reliable and valid results.

Evaluate the Sample

The sample refers to the group of participants or subjects who are included in the study. You should evaluate whether the sample size is adequate and whether the participants are representative of the population under study.

Review the Data Collection Methods

You should review the data collection methods used in the study to ensure that they are valid and reliable. This includes assessing both the measures and the procedures used to collect the data.

Examine the Statistical Analysis

Statistical analysis refers to the methods used to analyze the data. You should examine whether the statistical analysis is appropriate for the research question and whether it is likely to produce valid and reliable results.

Assess the Conclusions

You should evaluate whether the data support the conclusions drawn from the study and whether those conclusions answer the research question.

Consider the Limitations

Finally, you should consider the limitations of the study, including any potential biases or confounding factors that may have influenced the results.
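The seven steps above can be captured in a simple scoring rubric. The sketch below is illustrative only: the step names and the 0–2 rating scale are assumptions for demonstration, not a standard appraisal instrument.

```python
# Illustrative rubric for the seven evaluation steps. The 0-2 scale
# (0 = serious weakness, 1 = partial, 2 = adequate) is an assumption.
STEPS = [
    "research question",
    "study design",
    "sample",
    "data collection",
    "statistical analysis",
    "conclusions",
    "limitations",
]

def evaluate(ratings: dict) -> tuple:
    """Return (total score, steps rated 0) for ratings on a 0-2 scale."""
    missing = [s for s in STEPS if s not in ratings]
    if missing:
        raise ValueError(f"unrated steps: {missing}")
    total = sum(ratings[s] for s in STEPS)
    weak = [s for s in STEPS if ratings[s] == 0]
    return total, weak

# Example: a study that is sound except for its sampling and a thin
# limitations section.
ratings = {s: 2 for s in STEPS}
ratings["sample"] = 0        # e.g. small convenience sample
ratings["limitations"] = 1   # limitations only partially discussed
score, weak = evaluate(ratings)
print(score, weak)           # total out of a maximum of 14
```

A rubric like this does not replace expert judgement, but it forces the evaluator to rate every step rather than only the most visible flaws.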

Evaluating Research Methods

Common methods for evaluating research include:

  • Peer review: Peer review is a process where experts in the field review a study before it is published. This helps ensure that the study is accurate, valid, and relevant to the field.
  • Critical appraisal: Critical appraisal involves systematically evaluating a study based on specific criteria. This helps assess the quality of the study and the reliability of the findings.
  • Replication: Replication involves repeating a study to test the validity and reliability of the findings. This can help identify any errors or biases in the original study.
  • Meta-analysis: Meta-analysis is a statistical method that combines the results of multiple studies to provide a more comprehensive understanding of a particular topic. This can help identify patterns or inconsistencies across studies.
  • Consultation with experts: Consulting with experts in the field can provide valuable insights into the quality and relevance of a study. Experts can also help identify potential limitations or biases in the study.
  • Review of funding sources: Examining the funding sources of a study can help identify any potential conflicts of interest or biases that may have influenced the study design or interpretation of results.
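As an illustration of the meta-analysis method above, the sketch below pools three hypothetical study results using fixed-effect inverse-variance weighting. The effect sizes and standard errors are invented numbers, and a real meta-analysis involves further steps (e.g. heterogeneity checks and random-effects models).

```python
import math

def pool_fixed_effect(effects, std_errors):
    """Inverse-variance fixed-effect pooling of per-study effect sizes.

    Each study is weighted by 1/SE^2, so more precise studies count more.
    Returns (pooled effect, pooled SE, 95% CI lower, 95% CI upper).
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    half_width = 1.96 * pooled_se  # normal approximation for a 95% CI
    return pooled, pooled_se, pooled - half_width, pooled + half_width

# Hypothetical effects (e.g. standardised mean differences) from 3 studies.
effects = [0.30, 0.45, 0.20]
std_errors = [0.10, 0.15, 0.08]
est, se, lo, hi = pool_fixed_effect(effects, std_errors)
print(f"pooled effect {est:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

Note how the third study, with the smallest standard error, pulls the pooled estimate toward its own effect of 0.20.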

Example of Evaluating Research

Below is a sample research evaluation for students:

Title of the Study: The Effects of Social Media Use on Mental Health among College Students

Sample Size: 500 college students

Sampling Technique: Convenience sampling

  • Sample Size: The sample size of 500 college students is a moderate sample size, which could be considered representative of the college student population. However, it would be more representative if the sample size were larger or if a random sampling technique had been used.
  • Sampling Technique: Convenience sampling is a non-probability sampling technique, which means that the sample may not be representative of the population. This technique may introduce bias into the study since the participants are self-selected and may not be representative of the entire college student population. Therefore, the results of this study may not be generalizable to other populations.
  • Participant Characteristics: The study does not provide any information about the demographic characteristics of the participants, such as age, gender, race, or socioeconomic status. This information is important because social media use and mental health may vary among different demographic groups.
  • Data Collection Method: The study used a self-administered survey to collect data. Self-administered surveys may be subject to response bias and may not accurately reflect participants’ actual behaviors and experiences.
  • Data Analysis: The study used descriptive statistics and regression analysis to analyze the data. Descriptive statistics provide a summary of the data, while regression analysis is used to examine the relationship between two or more variables. However, the study did not provide information about the statistical significance of the results or the effect sizes.

Overall, while the study provides some insights into the relationship between social media use and mental health among college students, the use of a convenience sampling technique and the lack of information about participant characteristics limit the generalizability of the findings. In addition, the use of self-administered surveys may introduce bias into the study, and the lack of information about the statistical significance of the results limits the interpretation of the findings.

Note: The above example is only a sample for students. Do not copy and paste it directly into your assignment; do your own research for academic purposes.

Applications of Evaluating Research

Here are some of the applications of evaluating research:

  • Identifying reliable sources: By evaluating research, researchers, students, and other professionals can identify the most reliable sources of information to use in their work. They can determine the quality of research studies, including the methodology, sample size, data analysis, and conclusions.
  • Validating findings: Evaluating research can help to validate findings from previous studies. By examining the methodology and results of a study, researchers can determine if the findings are reliable and if they can be used to inform future research.
  • Identifying knowledge gaps: Evaluating research can also help to identify gaps in current knowledge. By examining the existing literature on a topic, researchers can determine areas where more research is needed, and they can design studies to address these gaps.
  • Improving research quality: Evaluating research can help to improve the quality of future research. By examining the strengths and weaknesses of previous studies, researchers can design better studies and avoid common pitfalls.
  • Informing policy and decision-making: Evaluating research is crucial in informing policy and decision-making in many fields. By examining the evidence base for a particular issue, policymakers can make informed decisions that are supported by the best available evidence.
  • Enhancing education: Evaluating research is essential in enhancing education. Educators can use research findings to improve teaching methods, curriculum development, and student outcomes.

Purpose of Evaluating Research

Here are some of the key purposes of evaluating research:

  • Determine the reliability and validity of research findings: By evaluating research, researchers can determine the quality of the study design, data collection, and analysis. They can determine whether the findings are reliable, valid, and generalizable to other populations.
  • Identify the strengths and weaknesses of research studies: Evaluating research helps to identify the strengths and weaknesses of research studies, including potential biases, confounding factors, and limitations. This information can help researchers to design better studies in the future.
  • Inform evidence-based decision-making: Evaluating research is crucial in informing evidence-based decision-making in many fields, including healthcare, education, and public policy. Policymakers, educators, and clinicians rely on research evidence to make informed decisions.
  • Identify research gaps: By evaluating research, researchers can identify gaps in the existing literature and design studies to address these gaps. This process can help to advance knowledge and improve the quality of research in a particular field.
  • Ensure research ethics and integrity: Evaluating research helps to ensure that research studies are conducted ethically and with integrity. Researchers must adhere to ethical guidelines to protect the welfare and rights of study participants and to maintain the trust of the public.

Characteristics for Evaluating Research

Characteristics to assess when evaluating research are as follows:

  • Research question/hypothesis: A good research question or hypothesis should be clear, concise, and well-defined. It should address a significant problem or issue in the field and be grounded in relevant theory or prior research.
  • Study design: The research design should be appropriate for answering the research question and be clearly described in the study. The study design should also minimize bias and confounding variables.
  • Sampling: The sample should be representative of the population of interest and the sampling method should be appropriate for the research question and study design.
  • Data collection: The data collection methods should be reliable and valid, and the data should be accurately recorded and analyzed.
  • Results: The results should be presented clearly and accurately, and the statistical analysis should be appropriate for the research question and study design.
  • Interpretation of results: The interpretation of the results should be based on the data and not influenced by personal biases or preconceptions.
  • Generalizability: The study findings should be generalizable to the population of interest and relevant to other settings or contexts.
  • Contribution to the field: The study should make a significant contribution to the field and advance our understanding of the research question or issue.

Advantages of Evaluating Research

Evaluating research has several advantages, including:

  • Ensuring accuracy and validity: By evaluating research, we can ensure that the research is accurate, valid, and reliable. This ensures that the findings are trustworthy and can be used to inform decision-making.
  • Identifying gaps in knowledge: Evaluating research can help identify gaps in knowledge and areas where further research is needed. This can guide future research and help build a stronger evidence base.
  • Promoting critical thinking: Evaluating research requires critical thinking skills, which can be applied in other areas of life. By evaluating research, individuals can develop their critical thinking skills and become more discerning consumers of information.
  • Improving the quality of research: Evaluating research can help improve the quality of research by identifying areas where improvements can be made. This can lead to more rigorous research methods and better-quality research.
  • Informing decision-making: By evaluating research, we can make informed decisions based on the evidence. This is particularly important in fields such as medicine and public health, where decisions can have significant consequences.
  • Advancing the field: Evaluating research can help advance the field by identifying new research questions and areas of inquiry. This can lead to the development of new theories and the refinement of existing ones.

Limitations of Evaluating Research

Limitations of Evaluating Research are as follows:

  • Time-consuming: Evaluating research can be time-consuming, particularly if the study is complex or requires specialized knowledge. This can be a barrier for individuals who are not experts in the field or who have limited time.
  • Subjectivity: Evaluating research can be subjective, as different individuals may have different interpretations of the same study. This can lead to inconsistencies in the evaluation process and make it difficult to compare studies.
  • Limited generalizability: The findings of a study may not be generalizable to other populations or contexts. This limits the usefulness of the study and may make it difficult to apply the findings to other settings.
  • Publication bias: Research that does not find significant results may be less likely to be published, which can create a bias in the published literature. This can limit the amount of information available for evaluation.
  • Lack of transparency: Some studies may not provide enough detail about their methods or results, making it difficult to evaluate their quality or validity.
  • Funding bias: Research funded by particular organizations or industries may be biased towards the interests of the funder. This can influence the study design, methods, and interpretation of results.

About the author


Muhammad Hassan

Researcher, Academic Writer, Web developer



Evaluation of research proposals by peer review panels: broader panels for broader assessments?


Rebecca Abma-Schouten, Joey Gijbels, Wendy Reijmerink, Ingeborg Meijer, Evaluation of research proposals by peer review panels: broader panels for broader assessments?, Science and Public Policy , Volume 50, Issue 4, August 2023, Pages 619–632, https://doi.org/10.1093/scipol/scad009


Panel peer review is widely used to decide which research proposals receive funding. Through this exploratory observational study at two large biomedical and health research funders in the Netherlands, we gain insight into how scientific quality and societal relevance are discussed in panel meetings. We explore, in ten review panel meetings of biomedical and health funding programmes, how panel composition and formal assessment criteria affect the arguments used. We observe that more scientific arguments are used than arguments related to societal relevance and expected impact. Also, more diverse panels result in a wider range of arguments, largely for the benefit of arguments related to societal relevance and impact. We discuss how funders can contribute to the quality of peer review by creating a shared conceptual framework that better defines research quality and societal relevance. We also contribute to a further understanding of the role of diverse peer review panels.

Scientific biomedical and health research is often supported by project or programme grants from public funding agencies such as governmental research funders and charities. Research funders primarily rely on peer review, often a combination of independent written review and discussion in a peer review panel, to inform their funding decisions. Peer review panels have the difficult task of integrating and balancing the various assessment criteria to select and rank the eligible proposals. With the increasing emphasis on societal benefit and being responsive to societal needs, the assessment of research proposals ought to include broader assessment criteria, including both scientific quality and societal relevance, and a broader perspective on relevant peers. This results in new practices of including non-scientific peers in review panels ( Del Carmen Calatrava Moreno et al. 2019 ; Den Oudendammer et al. 2019 ; Van den Brink et al. 2016 ). Relevant peers, in the context of biomedical and health research, include, for example, health-care professionals, (healthcare) policymakers, and patients as the (end-)users of research.

Currently, in scientific and grey literature, much attention is paid to what legitimate criteria are and to deficiencies in the peer review process, for example, focusing on the role of chance and the difficulty of assessing interdisciplinary or ‘blue sky’ research ( Langfeldt 2006 ; Roumbanis 2021a ). Our research primarily builds upon the work of Lamont (2009) , Huutoniemi (2012) , and Kolarz et al. (2016) . Their work articulates how the discourse in peer review panels can be understood by giving insight into disciplinary assessment cultures and social dynamics, as well as how panel members define and value concepts such as scientific excellence, interdisciplinarity, and societal impact. At the same time, there is little empirical work on what actually is discussed in peer review meetings and to what extent this is related to the specific objectives of the research funding programme. Such observational work is especially lacking in the biomedical and health domain.

The aim of our exploratory study is to learn what arguments panel members use in a review meeting when assessing research proposals in biomedical and health research programmes. We explore how arguments used in peer review panels are affected by (1) the formal assessment criteria and (2) the inclusion of non-scientific peers in review panels, also called (end-)users of research, societal stakeholders, or societal actors. We add to the existing literature by focusing on the actual arguments used in peer review assessment in practice.

To this end, we observed ten panel meetings in a variety of eight biomedical and health research programmes at two large research funders in the Netherlands: the governmental research funder The Netherlands Organisation for Health Research and Development (ZonMw) and the charitable research funder the Dutch Heart Foundation (DHF). Our first research question focuses on what arguments panel members use when assessing research proposals in a review meeting. The second examines to what extent these arguments correspond with the formal criteria on scientific quality and societal impact creation, as described in the programme brochure and assessment form. The third question focuses on how arguments used differ between panel members with different perspectives.

2.1 Relation between science and society

To understand the dual focus of scientific quality and societal relevance in research funding, a theoretical understanding and a practical operationalisation of the relation between science and society are needed. The conceptualisation of this relationship affects both who are perceived as relevant peers in the review process and the criteria by which research proposals are assessed.

The relationship between science and society is neither constant over time nor static, and it remains much debated. Scientific knowledge can have a huge impact on societies, either intended or unintended. Vice versa, the social environment and structure in which science takes place influence the rate of development, the topics of interest, and the content of science. However, the second part of this inter-relatedness between science and society generally receives less attention ( Merton 1968 ; Weingart 1999 ).

From a historical perspective, scientific and technological progress contributed to the view that science was valuable on its own account and that science and the scientist stood independent of society. While this protected science from unwarranted political influence, societal disengagement with science resulted in less authority by science and debate about its contribution to society. This interdependence and mutual influence contributed to a modern view of science in which knowledge development is valued both on its own merit and for its impact on, and interaction with, society. As such, societal factors and problems are important drivers for scientific research. This warrants that the relation and boundaries between science, society, and politics need to be organised and constantly reinforced and reiterated ( Merton 1968 ; Shapin 2008 ; Weingart 1999 ).

Glerup and Horst (2014) conceptualise the value of science to society and the role of society in science in four rationalities that reflect different justifications for their relation and thus also for who is responsible for (assessing) the societal value of science. The rationalities are arranged along two axes: one is related to the internal or external regulation of science and the other is related to either the process or the outcome of science as the object of steering. The first two rationalities of Reflexivity and Demarcation focus on internal regulation in the scientific community. Reflexivity focuses on the outcome: its central idea is that science, and thus scientists, should learn from societal problems and provide solutions. Demarcation focuses on the process: science should continuously question its own motives and methods. The latter two rationalities of Contribution and Integration focus on external regulation. The core of the outcome-oriented Contribution rationality is that scientists do not necessarily see themselves as ‘working for the public good’. Science should thus be regulated by society to ensure that outcomes are useful. The central idea of the process-oriented Integration rationality is that societal actors should be involved in science in order to influence the direction of research.

Research funders can be seen as external or societal regulators of science. They can focus on organising the process of science, Integration, or on scientific outcomes that function as solutions for societal challenges, Contribution. In the Contribution perspective, a funder could enhance outside (societal) involvement in science to ensure that scientists take responsibility to deliver results that are needed and used by society. From Integration it follows that actors from science and society need to work together in order to produce the best results. In this perspective, there is a lack of integration between science and society and more collaboration and dialogue are needed to develop a new kind of integrative responsibility ( Glerup and Horst 2014 ). This argues for the inclusion of other types of evaluators in research assessment. In reality, these rationalities are not mutually exclusive and also not strictly separated. As a consequence, multiple rationalities can be recognised in the reasoning of scientists and in the policies of research funders today.

2.2 Criteria for research quality and societal relevance

The rationalities of Glerup and Horst have consequences for which language is used to discuss societal relevance and impact in research proposals. Even though the main ingredients are quite similar, as a consequence of the coexisting rationalities in science, societal aspects can be defined and operationalised in different ways ( Alla et al. 2017 ). In the definition of societal impact by Reed, emphasis is placed on the outcome: the contribution to society. It includes the significance for society, the size of potential impact, and the reach, the number of people or organisations benefiting from the expected outcomes ( Reed et al. 2021 ). Other models and definitions focus more on the process of science and its interaction with society. Spaapen and Van Drooge introduced productive interactions in the assessment of societal impact, highlighting a direct contact between researchers and other actors. A key idea is that the interaction in different domains leads to impact in different domains ( Meijer 2012 ; Spaapen and Van Drooge 2011 ). Definitions that focus on the process often refer to societal impact as (1) something that can take place in distinguishable societal domains, (2) something that needs to be actively pursued, and (3) something that requires interactions with societal stakeholders (or users of research) ( Hughes and Kitson 2012 ; Spaapen and Van Drooge 2011 ).

Glerup and Horst show that process and outcome-oriented aspects can be combined in the operationalisation of criteria for assessing research proposals on societal aspects. Also, the funders participating in this study include the outcome—the value created in different domains—and the process—productive interactions with stakeholders—in their formal assessment criteria for societal relevance and impact. Different labels are used for these criteria, such as societal relevance, societal quality, and societal impact ( Abma-Schouten 2017 ; Reijmerink and Oortwijn 2017 ). In this paper, we use societal relevance or societal relevance and impact.

Scientific quality in research assessment frequently refers to all aspects and activities in the study that contribute to the validity and reliability of the research results and that contribute to the integrity and quality of the research process itself. The criteria commonly include the relevance of the proposal for the funding programme, the scientific relevance, originality, innovativeness, methodology, and feasibility ( Abdoul et al. 2012 ). Several studies demonstrated that quality is seen as not only a rich concept but also a complex concept in which excellence and innovativeness, methodological aspects, engagement of stakeholders, multidisciplinary collaboration, and societal relevance all play a role ( Geurts 2016 ; Roumbanis 2019 ; Scholten et al. 2018 ). Another study showed a comprehensive definition of ‘good’ science, which includes creativity, reproducibility, perseverance, intellectual courage, and personal integrity. It demonstrated that ‘good’ science involves not only scientific excellence but also personal values and ethics, and engagement with society ( Van den Brink et al. 2016 ). Noticeable in these studies is the connection made between societal relevance and scientific quality.

In summary, the criteria for scientific quality and societal relevance are conceptualised in different ways, and perspectives on the role of societal value creation and the involvement of societal actors vary strongly. Research funders hence have to pay attention to what the criteria mean for the panel members they recruit, and to navigate and negotiate how the criteria are applied in assessing research proposals. To be able to do so, more insight is needed into which elements of scientific quality and societal relevance are discussed in practice by peer review panels.

2.3 Role of funders and societal actors in peer review

National governments and charities are important funders of biomedical and health research. How this funding is distributed varies per country. Project funding is frequently allocated based on research programming by specialised public funding organisations, such as the Dutch Research Council in the Netherlands and ZonMw for health research. The DHF, the second largest private non-profit research funder in the Netherlands, provides project funding ( Private Non-Profit Financiering 2020 ). Funders, as so-called boundary organisations, can act as key intermediaries between government, science, and society ( Jasanoff 2011 ). Their responsibility is to develop effective research policies connecting societal demands and scientific ‘supply’. This includes setting up and executing fair and balanced assessment procedures ( Sarewitz and Pielke 2007 ). Herein, the role of societal stakeholders is receiving increasing attention ( Benedictus et al. 2016 ; De Rijcke et al. 2016 ; Dijstelbloem et al. 2013 ; Scholten et al. 2018 ).

All charitable health research funders in the Netherlands have, in the last decade, included patients at different stages of the funding process, including in assessing research proposals ( Den Oudendammer et al. 2019 ). To facilitate research funders in involving patients in assessing research proposals, the federation of Dutch patient organisations set up an independent reviewer panel with (at-risk) patients and direct caregivers ( Patiëntenfederatie Nederland, n.d. ). Other foundations have set up societal advisory panels including a wider range of societal actors than patients alone. The Committee Societal Quality (CSQ) of the DHF includes, for example, (at-risk) patients and a wide range of cardiovascular health-care professionals who are not active as academic researchers. This model is also applied by the Diabetes Foundation and the Princess Beatrix Muscle Foundation in the Netherlands ( Diabetesfonds, n.d. ; Prinses Beatrix Spierfonds, n.d. ).

In 2014, the Lancet presented a series of five papers about biomedical and health research known as the ‘increasing value, reducing waste’ series ( Macleod et al. 2014 ). The authors addressed several issues as well as potential solutions that funders can implement. They highlight, among others, the importance of improving the societal relevance of the research questions and including the burden of disease in research assessment in order to increase the value of biomedical and health science for society. A better understanding of and an increasing role of users of research are also part of the described solutions ( Chalmers et al. 2014 ; Van den Brink et al. 2016 ). This is also in line with the recommendations of the 2013 Declaration on Research Assessment (DORA) ( DORA 2013 ). These recommendations influence the way in which research funders operationalise their criteria in research assessment, how they balance the judgement of scientific and societal aspects, and how they involve societal stakeholders in peer review.

2.4 Panel peer review of research proposals

To assess research proposals, funders rely on the services of peer experts to review the thousands or perhaps millions of research proposals seeking funding each year. While often associated with scholarly publishing, peer review also includes the ex ante assessment of research grant and fellowship applications ( Abdoul et al. 2012 ). Peer review of proposals often includes a written assessment of a proposal by an anonymous peer and a peer review panel meeting to select the proposals eligible for funding. Peer review is an established component of professional academic practice, is deeply embedded in the research culture, and essentially consists of experts in a given domain appraising the professional performance, creativity, and/or quality of scientific work produced by others in their field of competence ( Demicheli and Di Pietrantonj 2007 ). The history of peer review as the default approach for scientific evaluation and accountability is, however, relatively young. While the term was unheard of in the 1960s, by 1970, it had become the standard. Since that time, peer review has become increasingly diverse and formalised, resulting in more public accountability ( Reinhart and Schendzielorz 2021 ).

While many studies have examined peer review in scholarly publishing, peer review in grant allocation processes has received less attention (Demicheli and Di Pietrantonj 2007). The most extensive work on this topic has been conducted by Lamont (2009), who studied peer review panels in five American research funding organisations, including observing three panels. Other examples include Roumbanis’s ethnographic observations of ten review panels in the natural and engineering sciences at the Swedish Research Council (Roumbanis 2017, 2021a). Huutoniemi was able to study, but not observe, four panels on environmental studies and social sciences at the Academy of Finland (Huutoniemi 2012). Additionally, Van Arensbergen and Van den Besselaar (2012) analysed peer review in a talent funding programme through interviews and by analysing the scores and outcomes at different stages of the peer review process. Particularly interesting is the study by Luo and colleagues of 164 written panel review reports, which showed that reviews from panels that included non-scientific peers described broader and more concrete impact topics. Mixed panels also more often connected research processes and characteristics of applicants with impact creation (Luo et al. 2021).

While these studies primarily focused on peer review panels in other disciplinary domains or are based on interviews or reports instead of direct observations, we believe that many of the findings are relevant to the functioning of panels in the context of biomedical and health research. From this literature, we learn to have realistic expectations of peer review. It is inherently difficult to predict in advance which research projects will provide the most important findings or breakthroughs ( Lee et al. 2013 ; Pier et al. 2018 ; Roumbanis 2021a , 2021b ). At the same time, these limitations may not substantiate the replacement of peer review by another assessment approach ( Wessely 1998 ). Many topics addressed in the literature are inter-related and relevant to our study, such as disciplinary differences and interdisciplinarity, social dynamics and their consequences for consistency and bias, and suggestions to improve panel peer review ( Lamont and Huutoniemi 2011 ; Lee et al. 2013 ; Pier et al. 2018 ; Roumbanis 2021a , b ; Wessely 1998 ).

Different scientific disciplines show different preferences and beliefs about how to build knowledge and thus have different perceptions of excellence. However, panellists are willing to respect and acknowledge other standards of excellence (Lamont 2009). Evaluation cultures also differ between scientific fields. Science, technology, engineering, and mathematics panels might, in comparison with panellists from the social sciences and humanities, be more concerned with the consistency of the assessment across panels and therefore with clear definitions and uses of assessment criteria (Lamont and Huutoniemi 2011). However, much is still to be learned about how panellists’ cognitive affiliations with particular disciplines unfold in the evaluation process. The assessment of interdisciplinary research is therefore much more complex than simply improving the criteria or procedure, because less explicit repertoires would also need to change (Huutoniemi 2012).

Social dynamics play a role, as panellists may differ in their motivation to engage in allocation processes, which could create bias (Lee et al. 2013). Placing emphasis on meeting established standards or on thoroughness in peer review may favour uncontroversial and safe projects, especially where strong competition puts pressure on experts to reach a consensus (Langfeldt 2001, 2006). Personal interest and cognitive similarity may also contribute to conservative bias, which could negatively affect controversial or frontier science (Luukkonen 2012; Roumbanis 2021a; Travis and Collins 1991). Central to this strand of the literature is the finding that panel conclusions are the outcome of, and are influenced by, group interaction (Van Arensbergen et al. 2014a). Differences in, for example, the status and expertise of the panel members can play an important role in group dynamics. Insights from social psychology on group dynamics can help in understanding and avoiding bias in peer review panels (Olbrecht and Bornmann 2010). For example, group performance research shows that diverse groups with complementary skills make better group decisions than homogenous groups. Yet heterogeneity can also increase conflict within the group (Forsyth 1999). It is therefore important to pay attention to power dynamics and to maintain team spirit and good communication (Van Arensbergen et al. 2014a), especially in meetings that include both scientific and non-scientific peers.

The literature also provides funders with starting points to improve the peer review process. For example, the explicitness of review procedures positively influences the decision-making processes ( Langfeldt 2001 ). Strategic voting and decision-making appear to be less frequent in panels that rate than in panels that rank proposals. Also, an advisory instead of a decisional role may improve the quality of the panel assessment ( Lamont and Huutoniemi 2011 ).

Despite different disciplinary evaluative cultures, formal procedures, and criteria, panel members with different backgrounds develop shared customary rules of deliberation that facilitate agreement and help avoid situations of conflict ( Huutoniemi 2012 ; Lamont 2009 ). This is a necessary prerequisite for opening up peer review panels to include non-academic experts. When doing so, it is important to realise that panel review is a social, emotional, and interactional process. It is therefore important to also take these non-cognitive aspects into account when studying cognitive aspects ( Lamont and Guetzkow 2016 ), as we do in this study.

In summary, we learn from the literature that (1) the specific criteria used to operationalise the scientific quality and societal relevance of research matter, (2) the rationalities of Glerup and Horst predict that not everyone values societal aspects, or involves non-scientists in peer review, to the same extent and in the same way, (3) this may affect the way peer review panels discuss these aspects, and (4) peer review is a challenging group process that could accommodate other rationalities in order to prevent bias towards specific scientific criteria. To disentangle these aspects, we carried out an observational study of a diverse range of peer review panel sessions using a fixed set of criteria focusing on scientific quality and societal relevance.

3.1 Research assessment at ZonMw and the DHF

The peer review approaches and the criteria used by the DHF and ZonMw are largely comparable. Funding programmes at both organisations start with a brochure describing the purposes, goals, and conditions for research applications, as well as the assessment procedure and criteria. Both organisations apply a two-stage process. In the first phase, reviewers are asked to write a peer review. In the second phase, a panel assesses the application based on the written reviews and the applicants’ rebuttal. The panels advise the board on proposals eligible for funding, including a ranking of these proposals.

There are also differences between the two organisations. At ZonMw, the criteria for societal relevance and quality are operationalised in the ZonMw Framework Fostering Responsible Research Practices (Reijmerink and Oortwijn 2017). This contributes to a common operationalisation of both quality and societal relevance at the level of individual funding programmes. Important elements in the criteria for societal relevance are, for instance, stakeholder participation, (applying) holistic health concepts, and the added value of knowledge in practice, policy, and education. The framework was developed to optimise the funding process from the perspective of knowledge utilisation and includes concepts such as productive interactions and Open Science. It is part of the ZonMw Impact Assessment Framework, which is aimed at guiding the planning, monitoring, and evaluation of funding programmes (Reijmerink et al. 2020). At ZonMw, interdisciplinary panels are set up specifically for each funding programme. These panels include academics from a wide range of disciplines and often non-academic peers, such as policymakers, health-care professionals, and patients.

At the DHF, the criteria for scientific quality and societal relevance (the latter called societal impact at the DHF) originate from the strategy report of the advisory committee CardioVascular Research Netherlands (Reneman et al. 2010). This report forms the basis of the DHF research policy, which focuses on scientific and societal impact by creating national collaborations in thematic, interdisciplinary research programmes (so-called consortia) that connect preclinical and clinical expertise into one concerted effort. An International Scientific Advisory Committee (ISAC) was established to assess these thematic consortia. This panel consists of international scientists, primarily with expertise in the broad cardiovascular research field. The DHF criteria for societal impact were redeveloped in 2013 in collaboration with its CSQ, the panel that assesses and advises on the societal aspects of proposed studies. The societal impact criteria include the relevance of the health-care problem, the expected contribution to a solution, attention to the next step in science and towards implementation in practice, and the involvement of and interaction with (end-)users of research (R.Y. Abma-Schouten and I.M. Meijer, unpublished data). Peer review panels for consortium funding are generally composed of members of the ISAC, members of the CSQ, and ad hoc panel members relevant to the specific programme. CSQ members often hold a pre-meeting before the final panel meeting to prepare and empower the CSQ representatives participating in the peer review panel.

3.2 Selection of funding programmes

To compare and evaluate observations between the two organisations, we selected funding programmes that were relatively comparable in scope and aims. The criteria were (1) a translational and/or clinical objective and (2) a selection procedure in which review panels were responsible for the (final) relevance and quality assessment of grant applications. In total, we selected eight programmes: four at each organisation. At the DHF, two programmes were chosen in which the CSQ did not participate, to better disentangle the role of panel composition. For each programme, we observed the selection process, varying from one session on a single day (taking 2–8 h) to multiple sessions over several days. Ten sessions were observed in total, of which eight were final peer review panel meetings and two were CSQ meetings preparing for the panel meeting.

After management approval for the study in both organisations, we asked programme managers and panel chairpersons of the programmes that were selected for their consent for observation; none refused participation. Panel members were, in a passive consent procedure, informed about the planned observation and anonymous analyses.

To ensure the independence of this evaluation, the selection of the grant programmes and peer review panels observed was at the discretion of the project team of this study. The observations and the supervision of the analyses were performed by the senior author, who is not affiliated with either funder.

3.3 Observation matrix

Given the lack of a common operationalisation for scientific quality and societal relevance, we decided to use an observation matrix with a fixed set of detailed aspects as a gold standard to score the brochures, the assessment forms, and the arguments used in panel meetings. The matrix used for the observations of the review panels was based upon and adapted from a ‘grant committee observation matrix’ developed by Van Arensbergen. The original matrix informed a literature review on the selection of talent through peer review and the social dynamics in grant review committees ( van Arensbergen et al. 2014b ). The matrix includes four categories of aspects that operationalise societal relevance, scientific quality, committee, and applicant (see  Table 1 ). The aspects of scientific quality and societal relevance were adapted to fit the operationalisation of scientific quality and societal relevance of the organisations involved. The aspects concerning societal relevance were derived from the CSQ criteria, and the aspects concerning scientific quality were based on the scientific criteria of the first panel observed. The four argument types related to the panel were kept as they were. This committee-related category reflects statements that are related to the personal experience or preference of a panel member and can be seen as signals for bias. This category also includes statements that compare a project with another project without further substantiation. The three applicant-related arguments in the original observation matrix were extended with a fourth on social skills in communication with society. We added health technology assessment (HTA) because one programme specifically focused on this aspect. We tested our version of the observation matrix in pilot observations.

Table 1. Aspects included in the observation matrix and examples of arguments.

Short title of aspect in the observation matrix: examples of arguments

Criterion: scientific quality
Fit in programme objectives: ‘This disease is underdiagnosed, and undertreated, and therefore fits the criteria of this call very well.’ ‘Might have a relevant impact on patient care, but to what extent does it align with the aims of this programme.’
Match science and health-care problem: ‘It is not properly compared to the current situation (standard of care).’ ‘Super relevant application with a fitting plan, perhaps a little too mechanistic.’
International competitiveness: ‘Something is done all over the world, but they do many more evaluations, however.’
Feasibility of the aims: ‘… because this is a discovery study the power calculation is difficult, but I would recommend to increase the sample size.’ ‘It’s very risky, because this is an exploratory … study without hypotheses.’ ‘The aim is to improve …, but there is no control to compare with.’ ‘Well substantiated that they are able to achieve the objectives.’
Plan of work: ‘Will there be enough cases in this cohort?’ ‘The budget is no longer correct.’ ‘Plan is good, but … doubts about the approach, because too little information….’

Criterion: societal relevance
Health-care problem: ‘Relevant problem for a small group.’ ‘… but is this a serious health condition?’ ‘Prevalence is low, but patients do die, morbidity is very high.’
Contribution to solution: ‘What will this add since we already do…?’ ‘It is unclear what the intervention will be after the diagnosis.’ ‘Relevance is not good. Side effects are not known and neither is effectiveness.’
Next step in science: ‘What is needed to go from this retrospective study towards implementation?’ ‘It’s not clear whether that work package is necessary or “nice to have”.’ ‘Knowledge utilisation paragraph is standard, as used by copywriters.’
Activities towards partners: ‘What do the applicants do to change the current practice?’ ‘Important that the company also contributes financially to the further development.’ ‘This proposal includes a good communication plan.’
Participation/diversity: ‘A user committee is described, but it isn’t well thought through: what is their role?’ ‘It’s also important to invite relatives of patients to participate.’ ‘They thought really well what their patient group can contribute to the study plan.’

Applicant-related aspects
Scientific publication applicant: ‘One project leader only has one original paper, …, focus more on other diseases.’ ‘Publication output not excellent. Conference papers and posters of local meetings, CV not so strong.’
Background applicant: ‘… not enough with this expertise involved in the leadership.’ ‘Very good CV, … has won many awards.’ ‘Candidate is excellent, top 10 to 20 in this field….’
Reputation applicant: ‘… the main applicant is a hotshot in this field.’ ‘Candidate leads cohorts as …, gets a no.’
Societal skills: ‘Impressed that they took my question seriously, that made my day.’ ‘They were very honest about overoptimism in the proposal.’ ‘Good group, but they seem quite aware of their own brilliance.’

HTA
HTA: ‘Concrete revenues are negative, however improvement in quality-adjusted life years but very shaky.’

Committee-related aspects
Personal experience with the applicant: ‘This researcher only wants to acquire knowledge, nothing further.’ ‘I reviewed him before and he is not very good at interviews.’
Personal/unasserted preference: ‘Excellent presentation, much better than the application.’ (Without further elaboration) ‘This academic lab has advantages, but also disadvantages with regard to independence.’ ‘If it can be done anywhere, it is in this group.’
Relation with applicants’ institute/network: ‘May come up with new models, they’re linked with a group in … who can do this very well.’
Comparison with other applications: ‘What is the relevance compared to the other proposal? They do something similar.’ ‘Look at the proposals as a whole, portfolio, we have clinical and we have fundamental.’

3.4 Observations

Data were primarily collected through observations. Our observations of review panel meetings were non-participatory: the observer and the goal of the observation were introduced at the start of the meeting, without further interaction during the meeting. To aid the processing of observations, some meetings were audio-recorded (sound only). Presentations and responses of applicants were not noted and were not part of the analysis. The observer took notes on the ongoing discussion and scored the arguments while listening. One meeting was not attended in person and was observed and scored only from the audio recording. Because this made identification of the panel members unreliable, this panel meeting was excluded from the analysis of the third research question on how the arguments used differ between panel members with different perspectives.

3.5 Grant programmes and the assessment criteria

We gathered and analysed all brochures and assessment forms used by the review panels in order to answer our second research question on the correspondence of the arguments used with the formal criteria. Several programmes consisted of multiple grant calls; in those cases, the specific call brochure was gathered and analysed rather than the overall programme brochure. Additional documentation (e.g. instructional presentations at the start of the panel meeting) was not included in the document analysis. All included documents were coded using the aforementioned observation matrix. The panel-related arguments were not used, because this category reflects personal arguments of panel members that do not appear in brochures or instructions. To avoid potential differences in scoring methods, two of the authors each independently scored half of the documents; each author's scores were afterwards checked and validated by the other. Differences were discussed until a consensus was reached.

3.6 Panel composition

In order to answer the third research question, background information on panel members was collected. We categorised the panel members into five common types: scientific, clinical scientific, health-care professional/clinical, patient, and policy. First, a list of all panel members was composed, including their scientific and professional backgrounds and affiliations. The theoretical notion that reviewers represent different types of users of research, and therefore potential impact domains (academic, social, economic, and cultural), guided the categorisation (Meijer 2012; Spaapen and Van Drooge 2011). Because clinical researchers play a dual role, both advancing research as fellow academics and using research output in health-care practice, we divided the academic members into two categories: non-clinical and clinical researchers. Multiple types of professional actors participated in each review panel. These were divided into two groups for the analysis: health-care professionals (without current academic activity) and policymakers in the health-care sector. No representatives of the private sector participated in the observed review panels. From the public domain, (at-risk) patients and patient representatives were part of several review panels. Only publicly available information was used to classify the panel members. Members were assigned to one category only: categorisation took place based on the specific role and expertise for which they were appointed to the panel.

In two of the four DHF programmes, the assessment procedure included the CSQ. In these two programmes, representatives of this CSQ participated in the scientific panel to articulate the findings of the CSQ meeting during the final assessment meeting. Two grant programmes were assessed by a review panel with solely (clinical) scientific members.

3.7 Analysis

Data were processed using ATLAS.ti 8 and Microsoft Excel 2010 to produce descriptive statistics. All observed arguments were coded, and each was given a randomised identification code for the panel member using that particular argument. The number of times an argument type was observed was used as an indicator of the relative importance of that argument in the appraisal of proposals. This approach provides research funders with a practical and reproducible method to evaluate the effect of policy changes on peer review. If codes or notes were unclear, post-observation validation of codes was carried out based on the observation matrix notes. Arguments that were noted by the observer but could not be matched with an existing code were first given a ‘non-existing’ code; these were resolved by listening back to the audio recordings. Arguments that could not be attributed to a panel member were assigned a ‘missing panel member’ code; this applied to 4.7 per cent of all codes.
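The counting procedure described above can be sketched in a few lines of code. This is an illustrative reconstruction only, not the authors' actual ATLAS.ti/Excel workflow; the argument codes, member identifiers, and the `MISSING` sentinel are hypothetical names chosen for the example.

```python
from collections import Counter

# Each observed argument is recorded as (panel_member_id, argument_code);
# "MISSING" marks arguments that could not be attributed to a panel member.
observations = [
    ("P01", "feasibility_of_aims"),
    ("P02", "health_care_problem"),
    ("MISSING", "plan_of_work"),
    ("P01", "feasibility_of_aims"),
]

# Frequency of each argument type, used as a proxy for its relative
# importance in the appraisal of proposals.
argument_counts = Counter(code for _, code in observations)

# Share of arguments with a 'missing panel member' code
# (4.7 per cent in the study reported here).
missing_share = sum(1 for member, _ in observations if member == "MISSING") / len(observations)

print(argument_counts.most_common(1))  # → [('feasibility_of_aims', 2)]
print(f"{missing_share:.1%}")          # → 25.0%
```

A real dataset would of course contain hundreds of such tuples per meeting; the tallying logic stays the same.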

After the analyses, two meetings were held to reflect on the results: one with the CSQ and the other with the programme coordinators of both organisations. The goal of these meetings was to improve our interpretation of the findings, disseminate the results derived from this project, and identify topics for further analyses or future studies.

3.8 Limitations

Our study focuses on the final phase of the peer review process of research applications in a real-life setting. Our design, a non-participant observation of peer review panels, also introduced several challenges (Liu and Maitlis 2010).

First, the independent review phase or pre-application phase was not part of our study. We therefore could not assess to what extent attention to certain aspects of scientific quality or societal relevance and impact in the review phase influenced the topics discussed during the meeting.

Second, the most important challenge of overt non-participant observations is the observer effect: the danger of causing reactivity in those under study. We believe that the consequences of this effect on our conclusions were limited because panellists are used to external observers in the meetings of these two funders. The observer briefly explained the goal of the study during the introductory round of the panel in general terms. The observer sat as unobtrusively as possible and avoided reactivity to discussions. Similar to previous observations of panels, we experienced that the fact that an observer was present faded into the background during a meeting ( Roumbanis 2021a ). However, a limited observer effect can never be entirely excluded.

Third, our design of scoring only the arguments raised, and not the responses of the applicants or information on the content of the proposals, has both strengths and limitations. With this approach, we could assure the anonymity of the grant procedures reviewed, the applicants and proposals, the panels, and the individual panellists. This was an important condition for the funders involved. We took the frequency with which arguments were used as a proxy for the relative importance of each argument in decision-making, which undeniably also has its caveats. Our data collection approach limits more in-depth reflection on which arguments were decisive in decision-making and on group dynamics during the interaction with the applicants, as non-verbal and non-content-related comments were not captured in this study.

Fourth, although this is one of the largest observational studies of the peer review assessment of grant applications, covering ten panels in eight grant programmes, many variables, both within and beyond our view, might explain differences in the arguments used. Examples of ‘confounding’ variables are the many variations in panel composition, the differences in the objectives of the programmes, and the range of the funding programmes. Our study should therefore be seen as exploratory, which warrants caution in drawing conclusions.

4.1 Overview of observational data

The grant programmes included in this study reflected a broad range of biomedical and health funding programmes, ranging from fellowship grants to translational research and applied health research. All formal documents available to the applicants and to the review panel were retrieved for both ZonMw and the DHF. In total, eighteen documents corresponding to the eight grant programmes were studied. The number of proposals assessed per programme varied from three to thirty-three. The duration of the panel meetings varied between 2 h and two consecutive days. Together, this resulted in a large spread in the number of total arguments used in an individual meeting and in a grant programme as a whole. In the shortest meeting, 49 arguments were observed versus 254 in the longest, with a mean of 126 arguments per meeting and on average 15 arguments per proposal.

Overall, we found consistency between how the criteria were operationalised in the grant programmes’ brochures and in the assessment forms of the review panels. At the same time, because the number of elements included in the observation matrix is limited, there was considerable diversity in the arguments falling within each aspect (see examples in Table 1). Some of these differences could possibly be explained by differences in the language used and in the level of detail of the observation matrix, the brochure, and the panel’s instructions. This was especially the case for the applicant-related aspects, for which the observation matrix was more detailed than the text in the brochure and assessment forms.

In interpreting our findings, it is important to take into account that, even though our data were largely complete and the observation matrix matched well with the description of the criteria in the brochures and assessment forms, there was a large diversity in the type and number of arguments used and in the number of proposals assessed across the grant programmes included in our study.

4.2 Wide range of arguments used by panels: scientific arguments used most

For our first research question, we explored the number and type of arguments used in the panel meetings. Figure 1 provides an overview of the arguments used. Scientific quality was discussed most. The number of times the feasibility of the aims was discussed clearly stands out in comparison to all other arguments. Also, the match between the science and the problem studied and the plan of work were frequently discussed aspects of scientific quality. International competitiveness of the proposal was discussed the least of all five scientific arguments.

The number of arguments used in panel meetings.

Attention was paid to societal relevance and impact in the panel meetings of both organisations. Yet, the language used differed somewhat between organisations. The contribution to a solution and the next step in science were the most often used societal arguments. At ZonMw, the impact of the health-care problem studied and the activities towards partners were less frequently discussed than the other three societal arguments. At the DHF, the five societal arguments were used equally often.

With the exception of the fellowship programme meeting, applicant-related arguments were not often used. The fellowship panel used arguments related to the applicant and to scientific quality about equally often. Committee-related arguments were also rarely used in the majority of the eight grant programmes observed. In three of the ten panel meetings, one or two arguments were observed that related to personal experience with the applicant or their direct network. In seven of the ten meetings, statements were observed that were unsubstantiated or explicitly announced as reflecting a personal preference. The frequency varied between one and seven statements (sixteen in total), which is low in comparison to the other arguments used (see Fig. 1 for examples).

4.3 Use of arguments varied strongly per panel meeting

The balance in the use of scientific and societal arguments varied strongly per grant programme, panel, and organisation. At ZonMw, two meetings had approximately an equal balance in societal and scientific arguments. In the other two meetings, scientific arguments were used twice to four times as often as societal arguments. At the DHF, three types of panels were observed. Different patterns in the relative use of societal and scientific arguments were observed for each of these panel types. In the two CSQ-only meetings the societal arguments were used approximately twice as often as scientific arguments. In the two meetings of the scientific panels, societal arguments were infrequently used (between zero and four times per argument category). In the combined societal and scientific panel meetings, the use of societal and scientific arguments was more balanced.

4.4 Match of arguments used by panels with the assessment criteria

In order to answer our second research question, we looked into the relation between the arguments used and the formal criteria. We observed that a broader range of arguments was often used than suggested by how the criteria were described in the brochure and assessment instruction. However, arguments related to aspects that were consistently included in the brochure and instruction seemed to be discussed more frequently than in programmes where those aspects were not consistently included, or not included at all. Although the match of the science with the health-care problem and the background and reputation of the applicant were not always made explicit in the brochure or instructions, they were discussed in many panel meetings. Supplementary Fig. S1 provides a visualisation of how the arguments used differ between the programmes in which those aspects were, or were not, consistently included in the brochure and instruction forms.

4.5 Two-thirds of the assessment was driven by scientific panel members

To answer our third question, we looked into the differences in arguments used between panel members representing a scientific, clinical scientific, professional, policy, or patient perspective. In each research programme, the majority of panellists had a scientific background ( n  = 35), thirty-four members had a clinical scientific background, twenty had a health professional/clinical background, eight members represented a policy perspective, and fifteen represented a patient perspective. Of the total number of arguments (1,097), two-thirds were made by members with a scientific or clinical scientific perspective. Members with a scientific background engaged most actively in the discussion, with a mean of twelve arguments per member. Clinical scientists and health-care professionals each participated with a mean of nine arguments, and members with a policy or patient perspective put forward the fewest arguments on average, namely seven and eight, respectively. Figure 2 provides a complete overview of the total and mean number of arguments used by the different disciplines in the various panels.
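The reported two-thirds share can be checked with a quick back-of-the-envelope calculation. This is an approximate reconstruction: the per-member means are rounded in the text, so the implied group totals are estimates, not observed counts.

```python
# Approximate reconstruction of the panel-composition arithmetic in Section 4.5.
# Member counts and (rounded) per-member means are taken from the text.
members = {
    "scientific":          {"n": 35, "mean_args": 12},
    "clinical scientific": {"n": 34, "mean_args": 9},
    "health professional": {"n": 20, "mean_args": 9},
    "policy":              {"n": 8,  "mean_args": 7},
    "patient":             {"n": 15, "mean_args": 8},
}

total_reported = 1097  # total arguments observed across all meetings

# Implied arguments from members with a (clinical) scientific perspective.
sci_args = sum(
    g["n"] * g["mean_args"]
    for name, g in members.items()
    if name in ("scientific", "clinical scientific")
)

share = sci_args / total_reported
print(f"(clinical) scientific share \u2248 {share:.2f}")
```

With these rounded means the implied share is roughly 0.66, consistent with the "two-thirds" figure reported above.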

Figure 2. The total and mean number of arguments displayed per subgroup of panel members.

4.6 Diverse use of arguments by panellists, but background matters

In meetings of both organisations, we observed a diverse use of arguments by the panel members. Yet, the use of arguments varied depending on the background of the panel member (see  Fig. 3 ). Those with a scientific and clinical scientific perspective used primarily scientific arguments. As could be expected, health-care professionals and patients used societal arguments more often.

Figure 3. The use of arguments differentiated by panel member background.

Further breakdown of arguments across backgrounds showed clear differences in the use of scientific arguments between the different disciplines of panellists. Scientists and clinical scientists discussed the feasibility of the aims more than twice as often as their second most often uttered element of scientific quality, which was the match between the science and the problem studied . Patients and members with a policy or health professional background put forward fewer but more varied scientific arguments.

Patients and health-care professionals accounted for approximately half of the societal arguments used, despite being a much smaller part of the panel’s overall composition. In other words, members with a scientific perspective were less likely to use societal arguments. The relevance of the health-care problem studied, activities towards partners , and arguments related to participation and diversity were not used often by this group. Patients often used arguments related to patient participation and diversity and activities towards partners , although the frequency of the use of the latter differed per organisation.

The majority of the applicant-related arguments were put forward by scientists, including clinical scientists. Committee-related arguments were very rare and are therefore not differentiated by panel member background, except for comments comparing a proposal with other applications; these were mainly put forward by panel members with a scientific background. HTA-related arguments were often used by panel members with a scientific perspective, whereas panel members with other perspectives rarely used them (see Supplementary Figs S2–S4 for a visual presentation of the differences between panel members on all aspects included in the matrix).

5.1 Explanations for arguments used in panels

Our observations show that most scientific-quality arguments were used frequently. However, with the exception of feasibility , the frequency with which arguments were used varied strongly between the meetings and between the individual proposals discussed. The fact that most arguments were not used consistently is not surprising given the results of previous studies, which showed heterogeneity in grant application assessments and low consistency in comments and scores by independent reviewers ( Abdoul et al. 2012 ; Pier et al. 2018 ). In an analysis of written assessments on nine observed dimensions, no dimension was used in more than 45 per cent of the reviews ( Hartmann and Neidhardt 1990 ).

There are several possible explanations for this heterogeneity. Roumbanis (2021a) described how being responsive to the different challenges in the proposals and to the points of attention arising from the written assessments influenced discussion in panels. Also, when disagreement arises, more time is spent on discussion ( Roumbanis 2021a ). One could infer that unambiguous, and thus undebated, aspects might remain largely undetected in our study. We believe, however, that the main points relevant to the assessment will not remain entirely unmentioned, because most panels in our study started the discussion with a short summary of the proposal, the written assessment, and the rebuttal. Lamont (2009) , however, points out that opening statements serve more goals than decision-making alone: they can also increase the credibility of the panellist by showing their comprehension and balanced assessment of an application. We can therefore not entirely disentangle whether the arguments observed most frequently were also considered most important or decisive, or whether they were simply the topics that generated the most disagreement.

An interesting difference with Roumbanis’ study was the available discussion time per proposal. In our study, most panels handled a limited number of proposals, allowing for longer discussions in comparison with the often 2-min time frame that Roumbanis (2021b) described, potentially contributing to a wider range of arguments being discussed. Limited time per proposal might also limit the number of panellists contributing to the discussion per proposal ( De Bont 2014 ).

5.2 Reducing heterogeneity by improving operationalisation and the consequent use of assessment criteria

We found that the language used for the operationalisation of the assessment criteria in programme brochures and in the observation matrix was much more detailed than in the instruction for the panel, which was often very concise. The exercise also illustrated that many terms were used interchangeably.

This was especially true for the applicant-related aspects. Several panels discussed how talent should be assessed. This confusion is understandable considering the changing values in research and its assessment ( Moher et al. 2018 ) and the fact that the funders’ instruction was very concise. For example, it was not made explicit whether the individual or the team should be assessed. Arensbergen et al. (2014b) described how, in grant allocation processes, talent is generally assessed using a limited set of characteristics. More objective and quantifiable outputs often prevailed at the expense of recognising and rewarding a broad variety of skills and traits combining professional, social, and individual capital ( DORA 2013 ).

In addition, committee-related arguments, like personal experiences with the applicant or their institute, were rarely used in our study. Comparisons between proposals were sometimes made without further argumentation, mainly by scientific panel members. This was especially pronounced in one (fellowship) grant programme with a high number of proposals. In this programme, the panel meeting concentrated on quickly comparing the quality of the applicants and of the proposals based on the reviewers’ judgement, rather than on a more in-depth discussion of the different aspects of the proposals. Because the review phase was not part of this study, the question of which aspects were used for the assessment of the proposals in this panel remains partially unanswered. However, weighing and comparing proposals on different aspects and with different inputs is a core element of scientific peer review, both in the review of papers and in the review of grants ( Hirschauer 2010 ). The large role of scientific panel members in comparing proposals is therefore not surprising.

One could anticipate that more consistent language in the operationalisation of criteria may lead to more clarity for both applicants and panellists and to more consistency in the assessment of research proposals. The trend in our observations was that arguments were used less frequently when the related criteria were not, or not consistently, included in the brochure and panel instruction. It remains, however, challenging to disentangle the influence of the formal definitions of criteria on the arguments used. Previous studies also encountered difficulties in studying the role of the formal instruction in peer review but concluded that this role is relatively limited ( Langfeldt 2001 ; Reinhart 2010 ).

The lack of a clear operationalisation of criteria can contribute to heterogeneity in peer review, as many scholars have found that assessors differ in their conceptualisation of good science and in the importance they attach to various aspects of research quality and societal relevance ( Abdoul et al. 2012 ; Geurts 2016 ; Scholten et al. 2018 ; Van den Brink et al. 2016 ). The large variation and the absence of a gold standard in the interpretation of scientific quality and societal relevance affect the consistency of peer review. As a consequence, it is challenging to systematically evaluate and improve peer review in order to fund the research that contributes most to science and society. To contribute to responsible research and innovation, it is therefore important that funders invest in a more consistent and conscientious peer review process ( Curry et al. 2020 ; DORA 2013 ).

A common conceptualisation of scientific quality and societal relevance and impact could improve the alignment between views on good scientific conduct, programmes’ objectives, and the peer review in practice. Such a conceptualisation could contribute to more transparency and quality in the assessment of research. By involving panel members from all relevant backgrounds, including the research community, health-care professionals, and societal actors, in a better operationalisation of criteria, more inclusive views of good science can be implemented more systematically in the peer review assessment of research proposals. The ZonMw Framework Fostering Responsible Research Practices is an example of an initiative aiming to support standardisation and integration ( Reijmerink et al. 2020 ).

Given the lack of a common definition or conceptualisation of scientific quality and societal relevance, an important methodological choice in our study was to use a fixed set of detailed aspects of two important criteria as a gold standard to score the brochures, the panel instructions, and the arguments used by the panels. This approach proved helpful in disentangling the different components of scientific quality and societal relevance. Having said that, it is important not to oversimplify the causes of heterogeneity in peer review, because these substantive arguments are not independent of non-cognitive, emotional, or social aspects ( Lamont and Guetzkow 2016 ; Reinhart 2010 ).

5.3 Do more diverse panels contribute to a broader use of arguments?

Both funders participating in our study have an outspoken public mission that requires sufficient attention to societal aspects in assessment processes. In reality, as observed in several panels, the main focus of peer review meetings is on scientific arguments. In addition to the possible explanations discussed earlier, the composition of the panel might play a role in explaining the arguments used in panel meetings. Our results have shown that health-care professionals and patients bring in more societal arguments than scientists, including those who are also clinicians. It is, however, not that simple: in the more diverse panels, panel members, regardless of their backgrounds, used more societal arguments than in the less diverse panels.

Observing ten panel meetings was sufficient to explore differences in the arguments used by panel members with different backgrounds. The pattern of primarily scientific arguments being raised by panels with mainly scientific members is not surprising; after all, assessing the scientific content of grant proposals is their main task and fits their competencies. As such, one could argue, depending on how one views the relationship between science and society, that health-care professionals and patients might be better suited to assess the value of research results for potential users. Scientific panel members and clinical scientists in our study used fewer arguments that reflect on opening up science and connecting it directly to others who can take it further (industry, health-care professionals, or other stakeholders). Patients filled this gap, since these two types of arguments were the most prevalent ones they put forward. Making an active connection with society apparently requires a broader, more diverse panel before scientists direct their attention to societal arguments. Evident from our observations is that in panels with patients and health-care professionals, their presence seemed to increase the attention paid to arguments beyond the scientific by all panel members, including scientists. This conclusion is congruent with the observation that there was a more equal balance in the use of societal and scientific arguments in the scientific panels in which the CSQ participated. It illustrates that opening up peer review panels to non-scientific members creates an opportunity to focus on both the contribution and the integrative rationality ( Glerup and Horst 2014 ) or, in other words, to allow productive interactions between scientific and non-scientific actors. This corresponds with previous research suggesting that, with regard to societal aspects, reviews from mixed panels were broader and richer ( Luo et al. 2021 ).
In panels with non-scientific experts, more emphasis was placed on the role of the proposed research process in increasing the likelihood of societal impact than on the causal importance of scientific excellence for broader impacts. This is in line with findings that panels with more disciplinary diversity, both in the range of disciplines and through the inclusion of generalist experts, applied more versatile styles to reach consensus and paid more attention to relevance and pragmatic value ( Huutoniemi 2012 ).

Our observations further illustrate that patients and health-care professionals were less vocal in panels than (clinical) scientists and were in the minority. This could reflect their social role and lower perceived authority in the panel. Several guides are available to help funders stimulate the equal participation of patients in science; these guides are also applicable to their involvement in peer review panels. Measures include support and training to prepare patients for deliberations with renowned scientists and explicitly addressing power differences ( De Wit et al. 2016 ). Panel chairs and programme officers have to set and supervise the conditions for the functioning of both the individual panel members and the panel as a whole ( Lamont 2009 ).

5.4 Suggestions for future studies

In future studies, it is important to further disentangle the role of the operationalisation and appraisal of assessment criteria in reducing heterogeneity in the arguments used by panels. More controlled experimental settings are a valuable addition to the current mainly observational methodologies applied to disentangle some of the cognitive and social factors that influence the functioning and argumentation of peer review panels. Reusing data from the panel observations and the data on the written reports could also provide a starting point for a bottom-up approach to create a more consistent and shared conceptualisation and operationalisation of assessment criteria.

To further understand the effects of opening up review panels to non-scientific peers, it is valuable to compare the role of diversity and interdisciplinarity in solely scientific panels versus panels that also include non-scientific experts.

In future studies, differences between domains and types of research should also be addressed. We hypothesise that biomedical and health research is perhaps more suited to the inclusion of non-scientific peers in panels than other research domains. For example, it is valuable to better understand whether potentially relevant users can be identified well enough in other research fields and to what extent non-academics can contribute to assessing the possible value of especially early-stage or blue-sky research.

The goal of our study was to explore in practice which arguments regarding the main criteria of scientific quality and societal relevance were used by peer review panels of biomedical and health research funding programmes. We showed that there is wide diversity in the number and range of arguments used, but three main scientific aspects were discussed most frequently: is the approach feasible, does the science match the problem, and is the work plan scientifically sound? Nevertheless, these scientific aspects were accompanied by a significant amount of discussion of societal aspects, of which the contribution to a solution was the most prominent. In comparison with scientific panellists, non-scientific panellists, such as health-care professionals, policymakers, and patients, often used a wider range of arguments and other societal arguments. Even more striking was that, even though non-scientific peers were often outnumbered and less vocal in panels, scientists also used a wider range of arguments when non-scientific peers were present.

It is relevant that two health research funders collaborated in the current study to reflect on and improve peer review in research funding. There are few studies published that describe live observations of peer review panel meetings. Many studies focus on alternatives for peer review or reflect on the outcomes of the peer review process, instead of reflecting on the practice and improvement of peer review assessment of grant proposals. Privacy and confidentiality concerns of funders also contribute to the lack of information on the functioning of peer review panels. In this study, both organisations were willing to participate because of their interest in research funding policies in relation to enhancing the societal value and impact of science. The study provided them with practical suggestions, for example, on how to improve the alignment in language used in programme brochures and instructions of review panels, and contributed to valuable knowledge exchanges between organisations. We hope that this publication stimulates more research funders to evaluate their peer review approach in research funding and share their insights.

For a long time, research funders relied solely on scientists for designing and executing peer review of research proposals, thereby delegating responsibility for the process. Although review panels have a discretionary authority, it is important that funders set and supervise the process and the conditions. We argue that one of these conditions should be the diversification of peer review panels and opening up panels for non-scientific peers.

Supplementary material is available at Science and Public Policy online.

Details of the data and information on how to request access are available from the first author.

Joey Gijbels and Wendy Reijmerink are employed by ZonMw. Rebecca Abma-Schouten is employed by the Dutch Heart Foundation and affiliated as an external PhD candidate with the Centre for Science and Technology Studies, Leiden University.

A special thanks to the panel chairs and programme officers of ZonMw and the DHF for their willingness to participate in this project. We thank Diny Stekelenburg, an internship student at ZonMw, for her contributions to the project. Our sincerest gratitude to Prof. Paul Wouters, Sarah Coombs, and Michiel van der Vaart for proofreading and their valuable feedback. Finally, we thank the editors and anonymous reviewers of Science and Public Policy for their thorough and insightful reviews and recommendations. Their contributions are recognisable in the final version of this paper.

Abdoul   H. , Perrey   C. , Amiel   P. , et al.  ( 2012 ) ‘ Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices ’, PLoS One , 7 : 1 – 15 .

Abma-Schouten   R. Y. ( 2017 ) ‘ Maatschappelijke Kwaliteit van Onderzoeksvoorstellen ’, Dutch Heart Foundation .

Alla   K. , Hall   W. D. , Whiteford   H. A. , et al.  ( 2017 ) ‘ How Do We Define the Policy Impact of Public Health Research? A Systematic Review ’, Health Research Policy and Systems , 15 : 84.

Benedictus   R. , Miedema   F. , and Ferguson   M. W. J. ( 2016 ) ‘ Fewer Numbers, Better Science ’, Nature , 538 : 453 – 4 .

Chalmers   I. , Bracken   M. B. , Djulbegovic   B. , et al.  ( 2014 ) ‘ How to Increase Value and Reduce Waste When Research Priorities Are Set ’, The Lancet , 383 : 156 – 65 .

Curry   S. , De Rijcke   S. , Hatch   A. , et al.  ( 2020 ) ‘ The Changing Role of Funders in Responsible Research Assessment: Progress, Obstacles and the Way Ahead ’, RoRI Working Paper No. 3, London : Research on Research Institute (RoRI) .

De Bont   A. ( 2014 ) ‘ Beoordelen Bekeken. Reflecties op het Werk van Een Programmacommissie van ZonMw ’, ZonMw .

De Rijcke   S. , Wouters   P. F. , Rushforth   A. D. , et al.  ( 2016 ) ‘ Evaluation Practices and Effects of Indicator Use—a Literature Review ’, Research Evaluation , 25 : 161 – 9 .

De Wit   A. M. , Bloemkolk   D. , Teunissen   T. , et al.  ( 2016 ) ‘ Voorwaarden voor Succesvolle Betrokkenheid van Patiënten/cliënten bij Medisch Wetenschappelijk Onderzoek ’, Tijdschrift voor Sociale Gezondheidszorg , 94 : 91 – 100 .

Del Carmen Calatrava Moreno   M. , Warta   K. , Arnold   E. , et al.  ( 2019 ) Science Europe Study on Research Assessment Practices . Technopolis Group Austria .

Demicheli   V. and Di Pietrantonj   C. ( 2007 ) ‘ Peer Review for Improving the Quality of Grant Applications ’, Cochrane Database of Systematic Reviews , 2 : MR000003.

Den Oudendammer   W. M. , Noordhoek   J. , Abma-Schouten   R. Y. , et al.  ( 2019 ) ‘ Patient Participation in Research Funding: An Overview of When, Why and How Amongst Dutch Health Funds ’, Research Involvement and Engagement , 5 .

Diabetesfonds ( n.d. ) Maatschappelijke Adviesraad < https://www.diabetesfonds.nl/over-ons/maatschappelijke-adviesraad > accessed 18 Sept 2022 .

Dijstelbloem   H. , Huisman   F. , Miedema   F. , et al.  ( 2013 ) ‘ Science in Transition Position Paper: Waarom de Wetenschap Niet Werkt Zoals het Moet, En Wat Daar aan te Doen Is ’, Utrecht : Science in Transition .

Forsyth   D. R. ( 1999 ) Group Dynamics , 3rd edn. Belmont : Wadsworth Publishing Company .

Geurts   J. ( 2016 ) ‘ Wat Goed Is, Herken Je Meteen ’, NRC Handelsblad < https://www.nrc.nl/nieuws/2016/10/28/wat-goed-is-herken-je-meteen-4975248-a1529050 > accessed 6 Mar 2022 .

Glerup   C. and Horst   M. ( 2014 ) ‘ Mapping “Social Responsibility” in Science ’, Journal of Responsible Innovation , 1 : 31 – 50 .

Hartmann   I. and Neidhardt   F. ( 1990 ) ‘ Peer Review at the Deutsche Forschungsgemeinschaft ’, Scientometrics , 19 : 419 – 25 .

Hirschauer   S. ( 2010 ) ‘ Editorial Judgments: A Praxeology of “Voting” in Peer Review ’, Social Studies of Science , 40 : 71 – 103 .

Hughes   A. and Kitson   M. ( 2012 ) ‘ Pathways to Impact and the Strategic Role of Universities: New Evidence on the Breadth and Depth of University Knowledge Exchange in the UK and the Factors Constraining Its Development ’, Cambridge Journal of Economics , 36 : 723 – 50 .

Huutoniemi   K. ( 2012 ) ‘ Communicating and Compromising on Disciplinary Expertise in the Peer Review of Research Proposals ’, Social Studies of Science , 42 : 897 – 921 .

Jasanoff   S. ( 2011 ) ‘ Constitutional Moments in Governing Science and Technology ’, Science and Engineering Ethics , 17 : 621 – 38 .

Kolarz   P. , Arnold   E. , Farla   K. , et al.  ( 2016 ) Evaluation of the ESRC Transformative Research Scheme . Brighton : Technopolis Group .

Lamont   M. ( 2009 ) How Professors Think : Inside the Curious World of Academic Judgment . Cambridge : Harvard University Press .

Lamont   M. Guetzkow   J. ( 2016 ) ‘How Quality Is Recognized by Peer Review Panels: The Case of the Humanities’, in M.   Ochsner , S. E.   Hug , and H.-D.   Daniel (eds) Research Assessment in the Humanities , pp. 31 – 41 . Cham : Springer International Publishing .

Lamont   M. Huutoniemi   K. ( 2011 ) ‘Comparing Customary Rules of Fairness: Evaluative Practices in Various Types of Peer Review Panels’, in C.   Charles   G.   Neil and L.   Michèle (eds) Social Knowledge in the Making , pp. 209–32. Chicago : The University of Chicago Press .

Langfeldt   L. ( 2001 ) ‘ The Decision-making Constraints and Processes of Grant Peer Review, and Their Effects on the Review Outcome ’, Social Studies of Science , 31 : 820 – 41 .

——— ( 2006 ) ‘ The Policy Challenges of Peer Review: Managing Bias, Conflict of Interests and Interdisciplinary Assessments ’, Research Evaluation , 15 : 31 – 41 .

Lee   C. J. , Sugimoto   C. R. , Zhang   G. , et al.  ( 2013 ) ‘ Bias in Peer Review ’, Journal of the American Society for Information Science and Technology , 64 : 2 – 17 .

Liu   F. Maitlis   S. ( 2010 ) ‘Nonparticipant Observation’, in A. J.   Mills , G.   Durepos , and E.   Wiebe (eds) Encyclopedia of Case Study Research , pp. 609 – 11 . Los Angeles : SAGE .

Luo   J. , Ma   L. , and Shankar   K. ( 2021 ) ‘ Does the Inclusion of Non-academic Reviewers Make Any Difference for Grant Impact Panels? ’, Science & Public Policy , 48 : 763 – 75 .

Luukkonen   T. ( 2012 ) ‘ Conservatism and Risk-taking in Peer Review: Emerging ERC Practices ’, Research Evaluation , 21 : 48 – 60 .

Macleod   M. R. , Michie   S. , Roberts   I. , et al.  ( 2014 ) ‘ Biomedical Research: Increasing Value, Reducing Waste ’, The Lancet , 383 : 101 – 4 .

Meijer   I. M. ( 2012 ) ‘ Societal Returns of Scientific Research. How Can We Measure It? ’, Leiden : Center for Science and Technology Studies, Leiden University .

Merton   R. K. ( 1968 ) Social Theory and Social Structure , Enlarged edn (reprint). New York : The Free Press .

Moher   D. , Naudet   F. , Cristea   I. A. , et al.  ( 2018 ) ‘ Assessing Scientists for Hiring, Promotion, And Tenure ’, PLoS Biology , 16 : e2004089.

Olbrecht   M. and Bornmann   L. ( 2010 ) ‘ Panel Peer Review of Grant Applications: What Do We Know from Research in Social Psychology on Judgment and Decision-making in Groups? ’, Research Evaluation , 19 : 293 – 304 .

Patiëntenfederatie Nederland ( n.d. ) Ervaringsdeskundigen Referentenpanel < https://www.patientenfederatie.nl/zet-je-ervaring-in/lid-worden-van-ons-referentenpanel > accessed 18 Sept 2022.

Pier   E. L. , Brauer   M. , Filut   A. , et al.  ( 2018 ) ‘ Low Agreement among Reviewers Evaluating the Same NIH Grant Applications ’, Proceedings of the National Academy of Sciences , 115 : 2952 – 7 .

Prinses Beatrix Spierfonds ( n.d. ) Gebruikerscommissie < https://www.spierfonds.nl/wie-wij-zijn/gebruikerscommissie > accessed 18 Sep 2022 .

Rathenau Instituut ( 2020 ) Private Non-profit Financiering van Onderzoek in Nederland < https://www.rathenau.nl/nl/wetenschap-cijfers/geld/wat-geeft-nederland-uit-aan-rd/private-non-profit-financiering-van#:∼:text=R%26D%20in%20Nederland%20wordt%20gefinancierd,aan%20wetenschappelijk%20onderzoek%20in%20Nederland > accessed 6 Mar 2022 .

Reneman   R. S. , Breimer   M. L. , Simoons   J. , et al.  ( 2010 ) ‘ De toekomst van het cardiovasculaire onderzoek in Nederland. Sturing op synergie en impact ’, Den Haag : Nederlandse Hartstichting .

Reed   M. S. , Ferré   M. , Marin-Ortega   J. , et al.  ( 2021 ) ‘ Evaluating Impact from Research: A Methodological Framework ’, Research Policy , 50 : 104147.

Reijmerink   W. and Oortwijn   W. ( 2017 ) ‘ Bevorderen van Verantwoorde Onderzoekspraktijken Door ZonMw ’, Beleidsonderzoek Online , accessed 6 Mar 2022.

Reijmerink   W. , Vianen   G. , Bink   M. , et al.  ( 2020 ) ‘ Ensuring Value in Health Research by Funders’ Implementation of EQUATOR Reporting Guidelines: The Case of ZonMw ’, Berlin : REWARD|EQUATOR .

Reinhart   M. ( 2010 ) ‘ Peer Review Practices: A Content Analysis of External Reviews in Science Funding ’, Research Evaluation , 19 : 317 – 31 .

Reinhart   M. and Schendzielorz   C. ( 2021 ) Trends in Peer Review . SocArXiv . < https://osf.io/preprints/socarxiv/nzsp5 > accessed 29 Aug 2022.

Roumbanis   L. ( 2017 ) ‘ Academic Judgments under Uncertainty: A Study of Collective Anchoring Effects in Swedish Research Council Panel Groups ’, Social Studies of Science , 47 : 95 – 116 .

——— ( 2021a ) ‘ Disagreement and Agonistic Chance in Peer Review ’, Science, Technology & Human Values , 47 : 1302 – 33 .

——— ( 2021b ) ‘ The Oracles of Science: On Grant Peer Review and Competitive Funding ’, Social Science Information , 60 : 356 – 62 .

( 2019 ) ‘ Ruimte voor ieders talent (Position Paper) ’, Den Haag : VSNU, NFU, KNAW, NWO en ZonMw . < https://www.universiteitenvannederland.nl/recognitionandrewards/wp-content/uploads/2019/11/Position-paper-Ruimte-voor-ieders-talent.pdf >.

( 2013 ) San Francisco Declaration on Research Assessment . The Declaration . < https://sfdora.org > accessed 2 Jan 2022 .

Sarewitz   D. and Pielke   R. A.  Jr. ( 2007 ) ‘ The Neglected Heart of Science Policy: Reconciling Supply of and Demand for Science ’, Environmental Science & Policy , 10 : 5 – 16 .

Scholten   W. , Van Drooge   L. , and Diederen   P. ( 2018 ) Excellent Is Niet Gewoon. Dertig Jaar Focus op Excellentie in het Nederlandse Wetenschapsbeleid . The Hague : Rathenau Instituut .

Shapin   S. ( 2008 ) The Scientific Life : A Moral History of a Late Modern Vocation . Chicago : University of Chicago press .

Spaapen   J. and Van Drooge   L. ( 2011 ) ‘ Introducing “Productive Interactions” in Social Impact Assessment ’, Research Evaluation , 20 : 211 – 8 .

Travis   G. D. L. and Collins   H. M. ( 1991 ) ‘ New Light on Old Boys: Cognitive and Institutional Particularism in the Peer Review System ’, Science, Technology & Human Values , 16 : 322 – 41 .

Van Arensbergen   P. and Van den Besselaar   P. ( 2012 ) ‘ The Selection of Scientific Talent in the Allocation of Research Grants ’, Higher Education Policy , 25 : 381 – 405 .

Van Arensbergen   P. , Van der Weijden   I. , and Van den Besselaar   P. V. D. ( 2014a ) ‘ The Selection of Talent as a Group Process: A Literature Review on the Social Dynamics of Decision Making in Grant Panels ’, Research Evaluation , 23 : 298 – 311 .

—— ( 2014b ) ‘ Different Views on Scholarly Talent: What Are the Talents We Are Looking for in Science? ’, Research Evaluation , 23 : 273 – 84 .

Van den Brink , G. , Scholten , W. , and Jansen , T. , eds ( 2016 ) Goed Werk voor Academici . Culemborg : Stichting Beroepseer .

Weingart   P. ( 1999 ) ‘ Scientific Expertise and Political Accountability: Paradoxes of Science in Politics ’, Science & Public Policy , 26 : 151 – 61 .

Wessely   S. ( 1998 ) ‘ Peer Review of Grant Applications: What Do We Know? ’, The Lancet , 352 : 301 – 5 .

Supplementary data

Month: Total Views:
April 2023 723
May 2023 266
June 2023 152
July 2023 130
August 2023 355
September 2023 189
October 2023 198
November 2023 181
December 2023 153
January 2024 197
February 2024 222
March 2024 227
April 2024 218
May 2024 229
June 2024 81

Email alerts

Citing articles via.

  • Recommend to your Library

Affiliations

  • Online ISSN 1471-5430
  • Print ISSN 0302-3427
  • Copyright © 2024 Oxford University Press
  • About Oxford Academic
  • Publish journals with us
  • University press partners
  • What we publish
  • New features  
  • Open access
  • Institutional account management
  • Rights and permissions
  • Get help with access
  • Accessibility
  • Advertising
  • Media enquiries
  • Oxford University Press
  • Oxford Languages
  • University of Oxford

Oxford University Press is a department of the University of Oxford. It furthers the University's objective of excellence in research, scholarship, and education by publishing worldwide

  • Copyright © 2024 Oxford University Press
  • Cookie settings
  • Cookie policy
  • Privacy policy
  • Legal notice

This Feature Is Available To Subscribers Only

Sign In or Create an Account

This PDF is available to Subscribers Only

For full access to this pdf, sign in to an existing account, or purchase an annual subscription.


How to Write a Research Proposal | Examples & Templates

Published on October 12, 2022 by Shona McCombes and Tegan George. Revised on November 21, 2023.

Structure of a research proposal

A research proposal describes what you will investigate, why it’s important, and how you will conduct your research.

The format of a research proposal varies between fields, but most proposals will contain at least these elements:

  • Introduction
  • Literature review
  • Research design
  • Reference list

While the sections may vary, the overall objective is always the same. A research proposal serves as a blueprint and guide for your research plan, helping you get organized and feel confident in the path forward you choose to take.

Table of contents

  • Research proposal purpose
  • Research proposal examples
  • Research design and methods
  • Contribution to knowledge
  • Research schedule
  • Other interesting articles
  • Frequently asked questions about research proposals

Academics often have to write research proposals to get funding for their projects. As a student, you might have to write a research proposal as part of a grad school application, or prior to starting your thesis or dissertation.

In addition to helping you figure out what your research can look like, a proposal can also serve to demonstrate why your project is worth pursuing to a funder, educational institution, or supervisor.

Research proposal aims

  • Show your reader why your project is interesting, original, and important.
  • Demonstrate your comfort and familiarity with your field.
  • Show that you understand the current state of research on your topic.
  • Make a case for your methodology.
  • Demonstrate that you have carefully thought about the data, tools, and procedures necessary to conduct your research.
  • Confirm that your project is feasible within the timeline of your program or funding deadline.

Research proposal length

The length of a research proposal can vary quite a bit. A bachelor’s or master’s thesis proposal can be just a few pages, while proposals for PhD dissertations or research funding are usually much longer and more detailed. Your supervisor can help you determine the best length for your work.

One trick to get started is to think of your proposal’s structure as a shorter version of your thesis or dissertation, only without the results, conclusion, and discussion sections.

Download our research proposal template


Writing a research proposal can be quite challenging, but a good starting point could be to look at some examples. We’ve included a few for you below.

  • Example research proposal #1: “A Conceptual Framework for Scheduling Constraint Management”
  • Example research proposal #2: “Medical Students as Mediators of Change in Tobacco Use”

Like your dissertation or thesis, the proposal will usually have a title page that includes:

  • The proposed title of your project
  • Your supervisor’s name
  • Your institution and department

The first part of your proposal is the initial pitch for your project. Make sure it succinctly explains what you want to do and why.

Your introduction should:

  • Introduce your topic
  • Give necessary background and context
  • Outline your  problem statement  and research questions

To guide your introduction , include information about:

  • Who could have an interest in the topic (e.g., scientists, policymakers)
  • How much is already known about the topic
  • What is missing from this current knowledge
  • What new insights your research will contribute
  • Why you believe this research is worth doing


As you get started, it’s important to demonstrate that you’re familiar with the most important research on your topic. A strong literature review  shows your reader that your project has a solid foundation in existing knowledge or theory. It also shows that you’re not simply repeating what other people have already done or said, but rather using existing research as a jumping-off point for your own.

In this section, share exactly how your project will contribute to ongoing conversations in the field by:

  • Comparing and contrasting the main theories, methods, and debates
  • Examining the strengths and weaknesses of different approaches
  • Explaining how you will build on, challenge, or synthesize prior scholarship

Following the literature review, restate your main  objectives . This brings the focus back to your own project. Next, your research design or methodology section will describe your overall approach, and the practical steps you will take to answer your research questions.

Building a research proposal methodology

  • Will you take a qualitative or quantitative approach, and what overall research design will you use?
  • How will you select your population, sample, or sources?
  • What tools and procedures will you use to collect data (e.g., surveys, interviews, experiments, existing datasets)?
  • How will you analyse the data, and what practical constraints (time, access, resources) apply?

To finish your proposal on a strong note, explore the potential implications of your research for your field. Emphasize again what you aim to contribute and why it matters.

For example, your results might have implications for:

  • Improving best practices
  • Informing policymaking decisions
  • Strengthening a theory or model
  • Challenging popular or scientific beliefs
  • Creating a basis for future research

Last but not least, your research proposal must include correct citations for every source you have used, compiled in a reference list . To create citations quickly and easily, you can use our free APA citation generator .
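APA's full rules are extensive, but the core of what a citation generator automates is assembling a reference string from its parts. The helper below is a simplified, hypothetical sketch (not Scribbr's actual generator) covering only the basic author/year/title/source pattern:

```python
def apa_reference(authors, year, title, source, url=None):
    """Assemble a basic APA-style reference string.

    `authors` is a list of "Surname, I." strings. This is a deliberately
    simplified sketch: real APA rules cover many more cases (20+ authors,
    DOIs, editions, retrieval dates, and so on).
    """
    if len(authors) == 1:
        author_part = authors[0]
    else:
        # Join all but the last author with commas, then add "& Last".
        author_part = ", ".join(authors[:-1]) + ", & " + authors[-1]
    ref = f"{author_part} ({year}). {title}. {source}."
    if url:
        ref += f" {url}"
    return ref

print(apa_reference(["McCombes, S.", "George, T."], 2023,
                    "How to write a research proposal", "Scribbr"))
# McCombes, S., & George, T. (2023). How to write a research proposal. Scribbr.
```

Whatever tool you use, spot-check its output against your required style guide before submitting.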

Some institutions or funders require a detailed timeline of the project, asking you to forecast what you will do at each stage and how long it may take. While a timeline is not always required, be sure to check the requirements of your project.

Here’s an example schedule to help you get started. You can also download a template at the button below.

Download our research schedule template

Example research schedule
Research phase (deadline)
1. Background research and literature review (20th January)
2. Research design planning and data analysis methods (13th February)
3. Data collection and preparation: conduct and code interviews with selected participants (24th March)
4. Data analysis of interview transcripts (22nd April)
5. Writing (17th June)
6. Revision of final work (28th July)
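To sanity-check a schedule like the example above, it helps to convert the deadlines into per-phase durations. The sketch below assumes a hypothetical 1 January start and the example deadlines (with an assumed year), purely for illustration:

```python
from datetime import date

# Hypothetical deadlines mirroring the example schedule (year assumed).
schedule = [
    ("Background research and literature review", date(2024, 1, 20)),
    ("Research design planning", date(2024, 2, 13)),
    ("Data collection and preparation", date(2024, 3, 24)),
    ("Data analysis of interview transcripts", date(2024, 4, 22)),
    ("Writing", date(2024, 6, 17)),
    ("Revision of final work", date(2024, 7, 28)),
]

start = date(2024, 1, 1)  # assumed project start
for phase, deadline in schedule:
    days = (deadline - start).days  # working window for this phase
    print(f"{phase}: {days} days (due {deadline:%d %b})")
    start = deadline  # next phase begins when this one ends
```

Seeing the durations side by side makes it easy to spot phases that have been given unrealistically little time.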

If you are applying for research funding, chances are you will have to include a detailed budget. This shows your estimates of how much each part of your project will cost.

Make sure to check what type of costs the funding body will agree to cover. For each item, include:

  • Cost : exactly how much money do you need?
  • Justification : why is this cost necessary to complete the research?
  • Source : how did you calculate the amount?

To determine your budget, think about:

  • Travel costs : do you need to go somewhere to collect your data? How will you get there, and how much time will you need? What will you do there (e.g., interviews, archival research)?
  • Materials : do you need access to any tools or technologies?
  • Help : do you need to hire any research assistants for the project? What will they do, and how much will you pay them?
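Since a budget is ultimately a per-item list of cost, justification, and source of the estimate, it can be tallied mechanically. The figures and items below are purely illustrative, not drawn from any real funding call:

```python
# Hypothetical budget items: (item, cost, justification, source of estimate).
budget = [
    ("Travel to interview sites", 1200.00,
     "Two field trips to conduct interviews", "Rail fares plus per diem"),
    ("Transcription software licence", 300.00,
     "Needed to transcribe and code interviews", "Vendor price list"),
    ("Research assistant (40 h)", 800.00,
     "Help with data preparation", "40 h at 20 per hour"),
]

# Sum the cost column and print one justified line per item.
total = sum(cost for _, cost, _, _ in budget)
for item, cost, justification, source in budget:
    print(f"{item}: {cost:.2f} ({justification}; estimate: {source})")
print(f"Total requested: {total:.2f}")
```

Keeping the justification and source next to each figure makes it straightforward to answer a funder's follow-up questions.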

If you want to know more about the research process , methodology , research bias , or statistics , make sure to check out some of our other articles with explanations and examples.

Methodology

  • Sampling methods
  • Simple random sampling
  • Stratified sampling
  • Cluster sampling
  • Likert scales
  • Reproducibility

 Statistics

  • Null hypothesis
  • Statistical power
  • Probability distribution
  • Effect size
  • Poisson distribution

Research bias

  • Optimism bias
  • Cognitive bias
  • Implicit bias
  • Hawthorne effect
  • Anchoring bias
  • Explicit bias

Once you’ve decided on your research objectives, you need to explain them in your paper, at the end of your problem statement.

Keep your research objectives clear and concise, and use appropriate verbs to accurately convey the work that you will carry out for each one (for example, “I will compare …”).

A research aim is a broad statement indicating the general purpose of your research project. It should appear in your introduction at the end of your problem statement , before your research objectives.

Research objectives are more specific than your research aim. They indicate the specific ways you’ll address the overarching aim.

A PhD, which is short for philosophiae doctor (doctor of philosophy in Latin), is the highest university degree that can be obtained. In a PhD, students spend 3–5 years writing a dissertation , which aims to make a significant, original contribution to current knowledge.

A PhD is intended to prepare students for a career as a researcher, whether that be in academia, the public sector, or the private sector.

A master’s is a 1- or 2-year graduate degree that can prepare you for a variety of careers.

All master’s involve graduate-level coursework. Some are research-intensive and intend to prepare students for further study in a PhD; these usually require their students to write a master’s thesis . Others focus on professional training for a specific career.

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy , it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

The best way to remember the difference between a research plan and a research proposal is that they have fundamentally different audiences. A research plan helps you, the researcher, organize your thoughts. On the other hand, a dissertation proposal or research proposal aims to convince others (e.g., a supervisor, a funding body, or a dissertation committee) that your research topic is relevant and worthy of being conducted.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

McCombes, S. & George, T. (2023, November 21). How to Write a Research Proposal | Examples & Templates. Scribbr. Retrieved June 18, 2024, from https://www.scribbr.com/research-process/research-proposal/


Writing a Research Proposal

  • First Online: 10 April 2022


  • Fahimeh Tabatabaei 3 &
  • Lobat Tayebi 3  


A research proposal is a roadmap that brings the researcher closer to the objectives: it takes the research topic from a purely subjective mind and turns it into an objective plan. It shows us what steps we need to take to reach the objective, what questions we should answer, and how much time we need. It is a framework based on which you can perform your research in a well-organized and timely manner. In other words, by writing a research proposal, you get a map that shows the direction to the destination (answering the research question). If the proposal is poorly prepared, you may realize, after spending a great deal of energy and money, that the result of the research has nothing to do with the initial objective, and the study may end up nowhere. Therefore, writing the proposal shows that the researcher is aware of the proper research process and can justify the significance of his/her idea.



Additional Resources

https://grants.nih.gov

https://grants.nih.gov/grants/oer.htm

https://www.ninr.nih.gov

https://www.niaid.nih.gov

http://www.grantcentral.com

http://www.saem.org/research

http://www.cfda.gov

http://www.ahrq.gov

https://www.nsf.gov


Author information

Authors and affiliations.

School of Dentistry, Marquette University, Milwaukee, WI, USA

Fahimeh Tabatabaei & Lobat Tayebi



Copyright information

© 2022 Springer Nature Switzerland AG

About this chapter

Tabatabaei, F., Tayebi, L. (2022). Writing a Research Proposal. In: Research Methods in Dentistry. Springer, Cham. https://doi.org/10.1007/978-3-030-98028-3_4

Download citation

DOI : https://doi.org/10.1007/978-3-030-98028-3_4

Published : 10 April 2022

Publisher Name : Springer, Cham

Print ISBN : 978-3-030-98027-6

Online ISBN : 978-3-030-98028-3


Grad Coach

What (Exactly) Is A Research Proposal?

A simple explainer with examples + free template.

By: Derek Jansen (MBA) | Reviewed By: Dr Eunice Rautenbach | June 2020 (Updated April 2023)

Whether you’re nearing the end of your degree and your dissertation is on the horizon, or you’re planning to apply for a PhD program, chances are you’ll need to craft a convincing research proposal . If you’re on this page, you’re probably unsure exactly what the research proposal is all about. Well, you’ve come to the right place.

Overview: Research Proposal Basics

  • What a research proposal is
  • What a research proposal needs to cover
  • How to structure your research proposal
  • Example/sample proposals
  • Proposal writing FAQs
  • Key takeaways & additional resources

What is a research proposal?

Simply put, a research proposal is a structured, formal document that explains what you plan to research (your research topic), why it’s worth researching (your justification), and how you plan to investigate it (your methodology).

The purpose of the research proposal (its job, so to speak) is to convince your research supervisor, committee or university that your research is suitable (for the requirements of the degree program) and manageable (given the time and resource constraints you will face).

The most important word here is “convince” – in other words, your research proposal needs to sell your research idea (to whoever is going to approve it). If it doesn’t convince them (of its suitability and manageability), you’ll need to revise and resubmit. This will cost you valuable time, which will either delay the start of your research or eat into its time allowance (which is bad news).


What goes into a research proposal?

A good dissertation or thesis proposal needs to cover the “what”, “why” and “how” of the proposed study. Let’s look at each of these attributes in a little more detail:

Your proposal needs to clearly articulate your research topic . This needs to be specific and unambiguous . Your research topic should make it clear exactly what you plan to research and in what context. Here’s an example of a well-articulated research topic:

An investigation into the factors which impact female Generation Y consumer’s likelihood to promote a specific makeup brand to their peers: a British context

As you can see, this topic is extremely clear. From this one line we can see exactly:

  • What’s being investigated – factors that make people promote or advocate for a specific makeup brand
  • Who it involves – female Gen-Y consumers
  • In what context – the United Kingdom

So, make sure that your research proposal provides a detailed explanation of your research topic . If possible, also briefly outline your research aims and objectives , and perhaps even your research questions (although in some cases you’ll only develop these at a later stage). Needless to say, don’t start writing your proposal until you have a clear topic in mind , or you’ll end up waffling and your research proposal will suffer as a result of this.


As we touched on earlier, it’s not good enough to simply propose a research topic – you need to justify why your topic is original . In other words, what makes it  unique ? What gap in the current literature does it fill? If it’s simply a rehash of the existing research, it’s probably not going to get approval – it needs to be fresh.

But,  originality  alone is not enough. Once you’ve ticked that box, you also need to justify why your proposed topic is  important . In other words, what value will it add to the world if you achieve your research aims?

As an example, let’s look at the sample research topic we mentioned earlier (factors impacting brand advocacy). In this case, if the research could uncover relevant factors, these findings would be very useful to marketers in the cosmetics industry, and would, therefore, have commercial value . That is a clear justification for the research.

So, when you’re crafting your research proposal, remember that it’s not enough for a topic to simply be unique. It needs to be useful and value-creating – and you need to convey that value in your proposal. If you’re struggling to find a research topic that makes the cut, watch  our video covering how to find a research topic .


It’s all good and well to have a great topic that’s original and valuable, but you’re not going to convince anyone to approve it without discussing the practicalities – in other words:

  • How will you actually undertake your research (i.e., your methodology)?
  • Is your research methodology appropriate given your research aims?
  • Is your approach manageable given your constraints (time, money, etc.)?

While it’s generally not expected that you’ll have a fully fleshed-out methodology at the proposal stage, you’ll likely still need to provide a high-level overview of your research methodology . Here are some important questions you’ll need to address in your research proposal:

  • Will you take a qualitative, quantitative or mixed-method approach?
  • What sampling strategy will you adopt?
  • How will you collect your data (e.g., interviews, surveys, etc.)?
  • How will you analyse your data (e.g., descriptive and inferential statistics, content analysis, discourse analysis, etc.)?
  • What potential limitations will your methodology carry?

So, be sure to give some thought to the practicalities of your research and have at least a basic methodological plan before you start writing up your proposal. If this all sounds rather intimidating, the video below provides a good introduction to research methodology and the key choices you’ll need to make.

How To Structure A Research Proposal

Now that we’ve covered the key points that need to be addressed in a proposal, you may be wondering, “ But how is a research proposal structured? “.

While the exact structure and format required for a research proposal differs from university to university, there are four “essential ingredients” that commonly make up the structure of a research proposal:

  • A rich introduction and background to the proposed research
  • An initial literature review covering the existing research
  • An overview of the proposed research methodology
  • A discussion regarding the practicalities (project plans, timelines, etc.)

In the video below, we unpack each of these four sections, step by step.

Research Proposal Examples/Samples

In the video below, we provide a detailed walkthrough of two successful research proposals (Master’s and PhD-level), as well as our popular free proposal template.

Proposal Writing FAQs

How long should a research proposal be?

This varies tremendously, depending on the university, the field of study (e.g., social sciences vs natural sciences), and the level of the degree (e.g. undergraduate, Masters or PhD) – so it’s always best to check with your university what their specific requirements are before you start planning your proposal.

As a rough guide, a formal research proposal at Masters-level often ranges between 2,000 and 3,000 words, while a PhD-level proposal can be far more detailed, ranging from 5,000 to 8,000 words. In some cases, a rough outline of the topic is all that’s needed, while in other cases, universities expect a very detailed proposal that essentially forms the first three chapters of the dissertation or thesis.

The takeaway – be sure to check with your institution before you start writing.

How do I choose a topic for my research proposal?

Finding a good research topic is a process that involves multiple steps. We cover the topic ideation process in this video post.

How do I write a literature review for my proposal?

While you typically won’t need a comprehensive literature review at the proposal stage, you still need to demonstrate that you’re familiar with the key literature and are able to synthesise it. We explain the literature review process here.

How do I create a timeline and budget for my proposal?

We explain how to craft a project plan/timeline and budget in Research Proposal Bootcamp .

Which referencing format should I use in my research proposal?

The expectations and requirements regarding formatting and referencing vary from institution to institution. Therefore, you’ll need to check this information with your university.

What common proposal writing mistakes do I need to look out for?

We’ve created a video post about some of the most common mistakes students make when writing a proposal – you can access that here. If you’re short on time, here’s a quick summary:

  • The research topic is too broad (or just poorly articulated).
  • The research aims, objectives and questions don’t align.
  • The research topic is not well justified.
  • The study has a weak theoretical foundation.
  • The research design is not articulated well enough.
  • Poor writing and sloppy presentation.
  • Poor project planning and risk management.
  • Not following the university’s specific criteria.

Key Takeaways & Additional Resources

As you write up your research proposal, remember the all-important core purpose:  to convince . Your research proposal needs to sell your study in terms of suitability and viability. So, focus on crafting a convincing narrative to ensure a strong proposal.

At the same time, pay close attention to your university’s requirements. While we’ve covered the essentials here, every institution has its own set of expectations and it’s essential that you follow these to maximise your chances of approval.

By the way, we’ve got plenty more resources to help you fast-track your research proposal. Here are some of our most popular resources to get you started:

  • Proposal Writing 101: An Introductory Webinar
  • Research Proposal Bootcamp: The Ultimate Online Course
  • Template: A basic template to help you craft your proposal

If you’re looking for 1-on-1 support with your research proposal, be sure to check out our private coaching service , where we hold your hand through the proposal development process (and the entire research journey), step by step.


Arega Berlie

Thank you. I learn much from the proposal since it is applied

Siyanda

Your effort is much appreciated – you have good articulation.

You have good articulation.

Douglas Eliaba

I do applaud your simplified method of explaining the subject matter, which indeed has broaden my understanding of the subject matter. Definitely this would enable me writing a sellable research proposal.

Weluzani

This really helping

Roswitta

Great! I liked your tutoring on how to find a research topic and how to write a research proposal. Precise and concise. Thank you very much. Will certainly share this with my students. Research made simple indeed.

Alice Kuyayama

Thank you very much. I an now assist my students effectively.

Thank you very much. I can now assist my students effectively.

Abdurahman Bayoh

I need any research proposal

Silverline

Thank you for these videos. I will need chapter by chapter assistance in writing my MSc dissertation

Nosi

Very helpfull

faith wugah

the videos are very good and straight forward

Imam

thanks so much for this wonderful presentations, i really enjoyed it to the fullest wish to learn more from you

Bernie E. Balmeo

Thank you very much. I learned a lot from your lecture.

Ishmael kwame Appiah

I really enjoy the in-depth knowledge on research proposal you have given. me. You have indeed broaden my understanding and skills. Thank you

David Mweemba

interesting session this has equipped me with knowledge as i head for exams in an hour’s time, am sure i get A++

Andrea Eccleston

This article was most informative and easy to understand. I now have a good idea of how to write my research proposal.

Thank you very much.

Georgina Ngufan

Wow, this literature is very resourceful and interesting to read. I enjoyed it and I intend reading it every now then.

Charity

Thank you for the clarity

Mondika Solomon

Thank you. Very helpful.

BLY

Thank you very much for this essential piece. I need 1o1 coaching, unfortunately, your service is not available in my country. Anyways, a very important eye-opener. I really enjoyed it. A thumb up to Gradcoach

Md Moneruszzaman Kayes

What is JAM? Please explain.

Gentiana

Thank you so much for these videos. They are extremely helpful! God bless!

azeem kakar

very very wonderful…

Koang Kuany Bol Nyot

thank you for the video but i need a written example

Submit a Comment Cancel reply

Your email address will not be published. Required fields are marked *

Save my name, email, and website in this browser for the next time I comment.

  • Print Friendly

Indian J Anaesth, v.60(9); 2016 Sep

How to write a research proposal?

Devika Rani Duggappa

Department of Anaesthesiology, Bangalore Medical College and Research Institute, Bengaluru, Karnataka, India

Writing a research proposal in the present era is a challenging task due to the constantly evolving trends in qualitative research design and the need to incorporate medical advances into the methodology. The proposal is a detailed plan or ‘blueprint’ for the intended study, and once it is completed, the research project should flow smoothly. Even today, many proposals submitted to post-graduate evaluation committees and many funding applications are substandard. A search was conducted with keywords such as research proposal, writing proposal and qualitative using search engines, namely, PubMed and Google Scholar, and an attempt has been made to provide broad guidelines for writing a scientifically appropriate research proposal.

INTRODUCTION

A clean, well-thought-out proposal forms the backbone for the research itself and hence becomes the most important step in the process of conduct of research.[ 1 ] The objective of preparing a research proposal would be to obtain approvals from various committees including ethics committee [details under ‘Research methodology II’ section [ Table 1 ] in this issue of IJA) and to request for grants. However, there are very few universally accepted guidelines for preparation of a good quality research proposal. A search was performed with keywords such as research proposal, funding, qualitative and writing proposals using search engines, namely, PubMed, Google Scholar and Scopus.

[Table 1: Five ‘C’s while writing a literature review (image not reproduced)]

BASIC REQUIREMENTS OF A RESEARCH PROPOSAL

A proposal needs to show how your work fits into what is already known about the topic and what new paradigm it will add to the literature, while specifying the question that the research will answer, establishing its significance, and the implications of the answer.[ 2 ] The proposal must be capable of convincing the evaluation committee about the credibility, achievability, practicality and reproducibility (repeatability) of the research design.[ 3 ] Four categories of audience with different expectations may be present in the evaluation committees, namely academic colleagues, policy-makers, practitioners and lay audiences who evaluate the research proposal. Tips for preparation of a good research proposal include: ‘be practical, be persuasive, make broader links, aim for crystal clarity and plan before you write’. A researcher must be balanced, with a realistic understanding of what can be achieved. Being persuasive implies that the researcher must be able to convince other researchers, research funding agencies, educational institutions and supervisors that the research is worth approving. The aim of the researcher should be clearly stated in simple language that describes the research in a way that non-specialists can comprehend, without the use of jargon. The proposal must not only demonstrate that it is based on an intelligent understanding of the existing literature but also show that the writer has thought about the time needed to conduct each stage of the research.[ 4 , 5 ]

CONTENTS OF A RESEARCH PROPOSAL

The contents or formats of a research proposal vary depending on the requirements of evaluation committee and are generally provided by the evaluation committee or the institution.

In general, a cover page should contain the (i) title of the proposal, (ii) names and affiliations of the researcher (principal investigator) and co-investigators, (iii) institutional affiliation (degree of the investigator and the name of the institution where the study will be performed), (iv) contact details such as phone numbers and e-mail IDs, and (v) lines for the signatures of the investigators.

The main contents of the proposal may be presented under the following headings: (i) introduction, (ii) review of literature, (iii) aims and objectives, (iv) research design and methods, (v) ethical considerations, (vi) budget, (vii) appendices and (viii) citations.[ 4 ]

Introduction

It is sometimes also termed the ‘need for study’ or ‘abstract’. The introduction is the initial pitch of an idea; it sets the scene and puts the research in context.[ 6 ] It should be designed to create interest in the reader about the topic and proposal, and should convey what you want to do, what necessitates the study and your passion for the topic.[ 7 ] Some questions that can be used to assess the significance of the study are: (i) Who has an interest in the domain of inquiry? (ii) What do we already know about the topic? (iii) What has not been answered adequately in previous research and practice? (iv) How will this research add to knowledge, practice and policy in this area? Some evaluation committees expect the last two questions to be elaborated under a separate heading of ‘background and significance’.[ 8 ] The introduction should also contain the hypothesis behind the research design. If a hypothesis cannot be constructed, the line of inquiry to be used in the research must be indicated.

Review of literature

It refers to all sources of scientific evidence pertaining to the topic of interest. In the present era of digitalisation and easy accessibility, there is an enormous amount of relevant data available, making it a challenge for the researcher to include all of it in his/her review.[ 9 ] It is crucial to structure this section intelligently so that the reader can grasp the argument of your study in relation to that of other researchers, while still demonstrating that your work is original and innovative. It is preferable to summarise each article in a paragraph, highlighting the details pertinent to the topic of interest. The review can progress from more general to more focused studies, or a historical progression can be used to develop the story, without making it exhaustive.[ 1 ] The literature should include supporting data, disagreements and controversies. Five ‘C’s may be kept in mind while writing a literature review[ 10 ] [ Table 1 ].

Aims and objectives

The research purpose (or goal or aim) gives a broad indication of what the researcher wishes to achieve in the research. The hypothesis to be tested can be the aim of the study. The objectives related to parameters or tools used to achieve the aim are generally categorised as primary and secondary objectives.

Research design and method

The objective here is to convince the reader that the overall research design and methods of analysis will correctly address the research problem and to impress upon the reader that the methodology/sources chosen are appropriate for the specific topic. It should be unmistakably tied to the specific aims of your study.

In this section, the methods and sources used to conduct the research must be discussed, including specific references to sites, databases, key texts or authors that will be indispensable to the project. There should be specific mention about the methodological approaches to be undertaken to gather information, about the techniques to be used to analyse it and about the tests of external validity to which researcher is committed.[ 10 , 11 ]

The components of this section include the following:[ 4 ]

Population and sample

Population refers to all the elements (individuals, objects or substances) that meet certain criteria for inclusion in a given universe,[ 12 ] and the sample refers to the subset of the population which meets the inclusion criteria for enrolment into the study. The inclusion and exclusion criteria should be clearly defined. The details pertaining to sample size are discussed in the article “Sample size calculation: Basic principles” published in this issue of IJA.
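As a rough illustration of the kind of calculation that article covers, the standard normal-approximation formula for the per-group sample size when comparing two means can be sketched in Python (the function name and defaults are illustrative, not from the article):

```python
from math import ceil
from statistics import NormalDist

def sample_size_two_means(sigma, delta, alpha=0.05, power=0.80):
    """Approximate per-group sample size for comparing two means
    (normal approximation): n = 2 * ((z_alpha + z_beta) * sigma / delta)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided significance level
    z_beta = z.inv_cdf(power)            # desired statistical power
    n = 2 * ((z_alpha + z_beta) * sigma / delta) ** 2
    return ceil(n)                       # round up to whole participants

# e.g. to detect a 10-unit difference when the SD is 15,
# at 5% significance and 80% power:
print(sample_size_two_means(sigma=15, delta=10))  # 36 per group
```

Note how the required n grows with the square of sigma/delta: halving the detectable difference quadruples the sample size.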

Data collection

The researcher is expected to give a detailed account of the methodology adopted for collection of data, which include the time frame required for the research. The methodology should be tested for its validity and ensure that, in pursuit of achieving the results, the participant's life is not jeopardised. The author should anticipate and acknowledge any potential barrier and pitfall in carrying out the research design and explain plans to address them, thereby avoiding lacunae due to incomplete data collection. If the researcher is planning to acquire data through interviews or questionnaires, copy of the questions used for the same should be attached as an annexure with the proposal.

Rigor (soundness of the research)

This addresses the strength of the research with respect to its neutrality, consistency and applicability. Rigor must be reflected throughout the proposal.

Neutrality

Neutrality refers to the robustness of a research method against bias. The author should describe in detail the measures taken to avoid bias, viz. blinding and randomisation, thus ensuring that the result obtained from the adopted method is not due to chance or influenced by confounding variables.

Consistency

Consistency considers whether the findings will be consistent if the inquiry was replicated with the same participants and in a similar context. This can be achieved by adopting standard and universally accepted methods and scales.

Applicability

Applicability refers to the degree to which the findings can be applied to different contexts and groups.[ 13 ]

Data analysis

This section deals with the reduction and reconstruction of data and its analysis, including sample size calculation. The researcher is expected to explain the steps adopted for coding and sorting the data obtained. The various tests to be used to analyse the data for robustness and significance should be clearly stated. The author should also name the statistician and the software to be used for data analysis and sample size calculation, along with their contribution.[ 9 ]
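For instance, one commonly used significance test, Welch's two-sample t statistic, can be sketched in plain Python (an illustrative example only; the article does not prescribe a particular test):

```python
from statistics import mean, stdev
from math import sqrt

def welch_t(group_a, group_b):
    """Welch's two-sample t statistic: compares group means without
    assuming equal variances. The p-value would then be looked up
    against a t distribution with Welch-Satterthwaite degrees of freedom."""
    na, nb = len(group_a), len(group_b)
    va, vb = stdev(group_a) ** 2, stdev(group_b) ** 2   # sample variances
    se = sqrt(va / na + vb / nb)                        # SE of the mean difference
    return (mean(group_a) - mean(group_b)) / se

control = [12, 14, 11, 13, 15]
treated = [16, 18, 15, 17, 19]
print(round(welch_t(treated, control), 2))  # 4.0
```

In practice a statistical package would be used, but the proposal should state which test, which software and why.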

Ethical considerations

Medical research introduces special moral and ethical problems that are not usually encountered by other researchers during data collection, and hence, the researcher should take special care in ensuring that ethical standards are met. Ethical considerations refer to the protection of the participants' rights (right to self-determination, right to privacy, right to autonomy and confidentiality, right to fair treatment and right to protection from discomfort and harm), obtaining informed consent and the institutional review process (ethical approval). The researcher needs to provide adequate information on each of these aspects.

Informed consent needs to be obtained from the participants (details discussed in further chapters), as well as the research site and the relevant authorities.

Budget

When the researcher prepares a research budget, he/she should predict and cost all aspects of the research and then add an allowance for unpredictable disasters, delays and rising costs. All items in the budget should be justified.

Appendices

Appendices are documents that support the proposal and application. They will be specific to each proposal, but documents that are usually required include the informed consent form, supporting documents, questionnaires, measurement tools and patient information about the study in layman's language.

Citations

As with any scholarly research paper, you must cite the sources used in composing your proposal. Although ‘references’ and ‘bibliography’ differ in meaning, the terms are often used interchangeably; this section lists all references cited in the research proposal.

Successful, qualitative research proposals should communicate the researcher's knowledge of the field and method and convey the emergent nature of the qualitative design. The proposal should follow a discernible logic from the introduction to presentation of the appendices.

Financial support and sponsorship

Conflicts of interest

There are no conflicts of interest.


Research at Brown

Writing an Evaluation Plan

An evaluation plan is an integral part of a grant proposal that provides information to improve a project during development and implementation.

For small projects, the Office of the Vice President for Research can help you develop a simple evaluation plan. If you are writing a proposal for a larger center grant, using a professional external evaluator is recommended. We can provide recommendations of external evaluators; please contact [email protected] ; for BioMed faculty visit the BioMed Evaluation Services webpage .

Do all grant proposals require an evaluation plan?

Not all grant proposals require an evaluation plan; however, many program announcements and funding opportunities stipulate that an evaluation strategy with specific milestones is an important element to be considered. If an evaluation plan is required, it will generally be listed in the program announcement. Most often, larger, more involved grant proposals will require an evaluation plan, while smaller, single-investigator proposals will not. If you are unsure whether your proposal requires an evaluation plan, please contact us.

It is worth noting there is a difference between evaluation and research although there are several commonalities. Most simply:

  • Research generalizes; evaluation particularizes
  • Research is designed to prove something; evaluation is designed to improve something
  • Research provides the basis for drawing conclusions; evaluation provides a basis for decision making
  • Research asks how it works; evaluation asks how well it works
  • Research is about what is; evaluation is about what is valuable

There are two types of evaluation typically requested by funders, formative and summative, and which you use is largely dictated by the purpose of the evaluation. Do you want to prove that you achieved the outcomes as intended (summative), or are you evaluating in order to monitor whether you are doing what you said you would in your grant application (formative)? Or both? We can help you prepare and review both types of evaluations, outlined below.

Formative or Process Evaluation does the following:

  • Assesses initial and ongoing project activities
  • Begins during project development and continues through implementation
  • Provides new and sometimes unanticipated insights into improving the outcomes of the project
  • Involves review by the principal investigator, the steering or governance committee, and either an internal or external evaluator (depending on grant requirements)

Summative or Outcomes Evaluation does the following:

  • Assesses the quality and success of a project in reaching stated goals
  • Presents the information collected for project activities and outcomes
  • Takes place after the completion of the project
  • Involves review by the principal investigator, the steering or governance committee, either an internal or external evaluator, and the program director of the funding agency

All evaluation plans should identify both participants (those directly involved in the project) and stakeholders (those otherwise invested by credibility, control or other capital), and should include the relevant items developed in the evaluation process.

What does the evaluation process entail?

The evaluation process can be broken down into a series of steps, from preparation to implementation and interpretation.

  • Develop a conceptual model of the project and identify key evaluation points. This ensures that all participants and stakeholders understand the project's structure and expected outcomes, and helps focus on the project’s most important elements.
  • Create evaluation questions and define measurable outcomes. Outcomes may be divided into short-term and long-term, or defined by the more immediate number of people affected by the project versus the overall changes that might not occur until after the project’s completion.
  • Develop an appropriate evaluation design. A successful evaluation both highlights the most useful information about the project’s objectives and addresses its shortcomings. In developing an evaluation design, you should first determine who will be studied and when, and then select a methodological approach and data collection instruments. The NSF-sponsored Online Evaluation Resource Library provides step-by-step instructions for developing an evaluation plan.
  • Collect data.
  • Analyze data and present to interested audiences.
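The planning steps above, a conceptual model, evaluation questions and measurable short- and long-term outcomes, can be sketched as a minimal data structure (all names are illustrative, not a standard schema):

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    """Toy model of an evaluation plan: a conceptual model plus
    questions mapped to measurable short- and long-term outcomes."""
    conceptual_model: str
    questions: list = field(default_factory=list)
    outcomes: dict = field(default_factory=dict)  # question -> outcome measures

    def add_question(self, question, short_term, long_term):
        self.questions.append(question)
        self.outcomes[question] = {"short_term": short_term,
                                   "long_term": long_term}

plan = EvaluationPlan(conceptual_model="Mentoring programme logic model")
plan.add_question(
    "Did participation improve retention?",
    short_term="number of students enrolled per term",
    long_term="graduation rate three years after completion",
)
print(len(plan.questions))  # 1
```

Writing the plan down in this explicit question-to-outcome form makes it easier to decide later what data to collect and when.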


Evaluation Research: Definition, Methods and Examples

Content Index

  • What is evaluation research
  • Why do evaluation research
  • Quantitative methods
  • Qualitative methods
  • Process evaluation research question examples
  • Outcome evaluation research question examples

What is evaluation research?

Evaluation research, also known as program evaluation, refers to a research purpose rather than a specific method. It is the systematic assessment of the worth or merit of the time, money, effort and resources spent in order to achieve a goal.

Evaluation research is closely related to, but slightly different from, more conventional social research . It uses many of the same methods, but because it takes place within an organizational context, it requires team skills, interpersonal skills, management skills and political savvy to a degree that conventional social research does not. Evaluation research also requires keeping the interests of the stakeholders in mind.

Evaluation research is a type of applied research, and so it is intended to have some real-world effect. Many methods, such as surveys and experiments, can be used for evaluation research. The process, spanning data collection, analysis and reporting, is rigorous and systematic, involving data about organizations, processes, projects, services and/or resources. Evaluation research enhances knowledge and decision-making, and leads to practical applications.


Why do evaluation research?

The common goal of most evaluations is to extract meaningful information from the audience and provide valuable insights to evaluators such as sponsors, donors, client groups, administrators, staff and other relevant constituencies. Most often, feedback is perceived as valuable if it helps in decision-making. However, evaluation research does not always create an impact that can be applied elsewhere; sometimes it fails to influence short-term decisions. Equally, it may initially seem to have no influence but have a delayed impact when the situation becomes more favorable. In spite of this, there is general agreement that the major goal of evaluation research should be to improve decision-making through the systematic utilization of measurable feedback.

Below are some of the benefits of evaluation research

  • Gain insights about a project or program and its operations

Evaluation research lets you understand what works and what doesn't, where you were, where you are and where you are headed. You can identify areas of improvement as well as strengths, which helps you figure out what to focus on more and whether there are any threats to your business. You can also find out whether there are hidden, as-yet-untapped sectors in the market.

  • Improve practice

It is essential to gauge your past performance and understand what went wrong in order to deliver better services to your customers. Unless it is a two-way communication, there is no way to improve on what you have to offer. Evaluation research gives an opportunity to your employees and customers to express how they feel and if there’s anything they would like to change. It also lets you modify or adopt a practice such that it increases the chances of success.

  • Assess the effects

After evaluating the efforts, you can see how well you are meeting objectives and targets. Evaluations let you measure if the intended benefits are really reaching the targeted audience and if yes, then how effectively.

  • Build capacity

Evaluations help you to analyze the demand pattern and predict if you will need more funds, upgrade skills and improve the efficiency of operations. It lets you find the gaps in the production to delivery chain and possible ways to fill them.

Methods of evaluation research

All market research methods involve collecting and analyzing data, making decisions about the validity of the information and deriving relevant inferences from it. Evaluation research comprises planning, conducting and analyzing the results, which includes the use of data collection techniques and the application of statistical methods.

Some popular evaluation methods are input measurement, output or performance measurement, impact or outcomes assessment, quality assessment, process evaluation, benchmarking, standards, cost analysis, organizational effectiveness, program evaluation methods, and LIS-centered methods. A few types of evaluation do not always result in a meaningful assessment, such as descriptive studies, formative evaluations, and implementation analysis. Evaluation research is concerned more with the information-processing and feedback functions of evaluation.

These methods can be broadly classified as quantitative and qualitative methods.

Quantitative methods

Quantitative research methods are used to measure anything tangible and produce answers to questions such as those below.

  • Who was involved?
  • What were the outcomes?
  • What was the price?

The best way to collect quantitative data is through surveys , questionnaires , and polls . You can also create pre-tests and post-tests, review existing documents and databases or gather clinical data.

Surveys are used to gather the opinions, feedback or ideas of your employees or customers and consist of various question types . They can be conducted face-to-face, by telephone, by mail, or online. Online surveys do not require human intervention and are far more efficient and practical. You can view survey results on the research tool's dashboard and dig deeper using filter criteria based on factors such as age, gender and location. You can also apply survey logic such as branching, quotas, chained surveys and looping to the questions, reducing the time needed to both create and respond to a survey. In addition, you can generate reports that involve statistical formulae and present data in a form that can be readily absorbed in meetings.
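The dashboard-style filtering described above can be sketched as a simple operation over response records (all field names here are hypothetical, not any particular tool's API):

```python
def filter_responses(responses, **criteria):
    """Keep only the responses matching every given criterion,
    mimicking a dashboard filter (e.g. age group, location)."""
    return [r for r in responses
            if all(r.get(key) == value for key, value in criteria.items())]

responses = [
    {"age_group": "18-24", "location": "Pune", "score": 8},
    {"age_group": "25-34", "location": "Pune", "score": 6},
    {"age_group": "18-24", "location": "Delhi", "score": 9},
]
young_pune = filter_responses(responses, age_group="18-24", location="Pune")
print(len(young_pune))  # 1
```

Each filter criterion narrows the subset, which is exactly how slicing survey results by demographics works conceptually.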


Quantitative data measure the depth and breadth of an initiative, for instance, the number of people who participated in the non-profit event, the number of people who enrolled for a new course at the university. Quantitative data collected before and after a program can show its results and impact.
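A minimal sketch of that before-and-after comparison, assuming simple numeric scores (function and variable names are illustrative):

```python
from statistics import mean

def pre_post_change(pre_scores, post_scores):
    """Mean change from pre-test to post-test, with the percent change
    relative to the pre-test mean. A crude indicator of program impact
    only; on its own it does not establish causation."""
    change = mean(post_scores) - mean(pre_scores)
    percent = 100 * change / mean(pre_scores)
    return change, percent

pre = [55, 60, 48, 62, 50]    # e.g. test scores before a course
post = [64, 66, 59, 70, 61]   # scores after the course
change, percent = pre_post_change(pre, post)
print(f"mean change: {change:.1f} ({percent:.1f}%)")
```

A stronger design would also track a comparison group that did not receive the program, so the change can be attributed to it rather than to outside factors.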

The accuracy of quantitative data to be used for evaluation research depends on how well the sample represents the population, the ease of analysis, and their consistency. Quantitative methods can fail if the questions are not framed correctly and not distributed to the right audience. Also, quantitative data do not provide an understanding of the context and may not be apt for complex issues.


Qualitative methods

Qualitative research methods are used where quantitative methods cannot solve the research problem , i.e., to measure intangible values. They answer questions such as:

  • What is the value added?
  • How satisfied are you with our service?
  • How likely are you to recommend us to your friends?
  • What will improve your experience?


Qualitative data is collected through observation, interviews, case studies, and focus groups. The steps for creating a qualitative study involve examining, comparing and contrasting, and understanding patterns. Analysts draw conclusions after identifying themes, clustering similar data, and finally reducing them to points that make sense.
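The coding-and-clustering steps above can be sketched with a toy keyword-to-theme scheme (the codes below are invented for illustration, not a standard qualitative-analysis taxonomy):

```python
from collections import Counter

# Illustrative coding scheme: keywords mapped to themes (assumed codes)
CODES = {
    "wait": "service speed", "slow": "service speed",
    "friendly": "staff attitude", "rude": "staff attitude",
    "price": "cost", "expensive": "cost",
}

def code_responses(responses):
    """Tag each free-text response with themes, then cluster by frequency."""
    themes = Counter()
    for text in responses:
        for keyword, theme in CODES.items():
            if keyword in text.lower():
                themes[theme] += 1
    return themes

interviews = [
    "The wait was far too long.",
    "Friendly people, but everything felt expensive.",
    "Slow service again this week.",
]
print(code_responses(interviews).most_common(1))  # [('service speed', 2)]
```

Real qualitative coding is interpretive and iterative rather than keyword matching, but the same examine-compare-cluster-reduce cycle is what analysts perform by hand.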

Observations may help explain behaviors as well as the social context that is generally not discovered by quantitative methods. Observations of behavior and body language can be done by watching a participant, recording audio or video. Structured interviews can be conducted with people alone or in a group under controlled conditions, or they may be asked open-ended qualitative research questions . Qualitative research methods are also used to understand a person’s perceptions and motivations.


The strength of this method is that group discussion can generate ideas and stimulate memories, with topics cascading as the discussion occurs. The accuracy of qualitative data depends on how well the contextual data explains complex issues and complements quantitative data. It helps answer ‘why’ and ‘how’ after an answer to ‘what’ has been obtained. The limitations of qualitative data for evaluation research are that they are subjective, time-consuming, costly, and difficult to analyze and interpret.


Survey software can be used for both evaluation research methods. You can use the sample questions in this article for evaluation research and send a survey in minutes using research software. Using a research tool simplifies the process, from creating a survey and importing contacts to distributing the survey and generating reports that aid the research.

Examples of evaluation research

Evaluation research questions lay the foundation of a successful evaluation. They define the topics that will be evaluated. Keeping evaluation questions ready not only saves time and money, but also makes it easier to decide what data to collect, how to analyze it, and how to report it.

Evaluation research questions should be developed and agreed on in the planning stage; ready-made research templates can also be used.

Process evaluation research question examples:

  • How often do you use our product in a day?
  • Were approvals taken from all stakeholders?
  • Can you report the issue from the system?
  • Can you submit the feedback from the system?
  • Was each task done as per the standard operating procedure?
  • What were the barriers to the implementation of each task?
  • Were any improvement areas discovered?

Outcome evaluation research question examples:

  • How satisfied are you with our product?
  • Did the program produce intended outcomes?
  • What were the unintended outcomes?
  • Has the program increased the knowledge of participants?
  • Were the participants of the program employable before the course started?
  • Do participants of the program have the skills to find a job after the course ended?
  • Is the knowledge of participants better compared to those who did not participate in the program?
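The last outcome question, comparing participants with non-participants, is typically answered with a two-group statistical test. A minimal sketch using made-up pass counts and a pooled two-proportion z statistic:

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Z statistic for comparing two proportions, e.g. the share of
    participants vs. non-participants who pass a knowledge test."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical numbers: 70 of 100 participants passed vs. 50 of 100
# non-participants. A |z| above ~1.96 suggests a real difference at
# the 5% significance level.
z = two_proportion_z(70, 100, 50, 100)
print(round(z, 2))
```

This is only a sketch; a real outcome evaluation would also check that the two groups are comparable to begin with.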


How to Evaluate a Study

Not all studies should be treated equally. Below are a few key factors to consider when evaluating a study’s conclusions.

  • Has the study been reviewed by other experts? Peer review, the process by which a study is sent to other researchers in a particular field for their notes and thoughts, is essential in evaluating a study’s findings. Since most consumers and members of the media are not trained to evaluate a study’s design and a researcher’s findings, studies that pass muster with other researchers and are accepted for publication in prestigious journals are generally more trustworthy.
  • Do other experts agree? Have other experts spoken out against the study’s findings? Who are these other experts and are their criticisms valid?
  • Are there reasons to doubt the findings? One of the most important things to keep in mind when reviewing studies is that correlation does not prove causation. For instance, just because there is an association between eating blueberries and weighing less does not mean that eating blueberries will make you lose weight. Researchers should look for other explanations for their findings, known as “confounding variables.” In this instance, they should consider that people who tend to eat blueberries may also tend to exercise more and consume fewer calories overall.
  • How do the conclusions fit with other studies? It’s rare that a single study is enough to overturn the preponderance of research offering a different conclusion. Though studies that buck the established notion are not necessarily wrong, they should be scrutinized closely to ensure that their findings are accurate.
  • How big was the study? Sample size matters. The more patients or subjects involved in a study, the more likely it is that the study’s conclusions aren’t merely due to random chance and are, in fact, statistically significant.
  • Are there any major flaws in the study’s design? This is one of the most difficult steps if you aren’t an expert in a particular field, but there are ways to look for bias. For example, was the study a “double-blind” experiment or were the researchers aware of which subjects were the control set?
  • Have the researchers identified any flaws or limitations with their research? Often buried in the conclusion, researchers acknowledge limitations or alternative explanations for their results. Because the universities, government agencies, or other organizations that funded and promoted the study often want to highlight the boldest conclusion possible, these caveats can be overlooked. However, they’re important when weighing how significant the study’s conclusions really are.
  • Have the findings been replicated? With growing headlines of academic fraud and leading journals forced to retract articles based on artificial results, replication of results is increasingly important to judge the merit of a study’s findings. If other researchers can replicate an experiment and come to a similar conclusion, it’s much easier to trust those results than those that have only been peer reviewed.
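The sample-size point above can be made concrete: for a surveyed proportion, the 95% margin of error shrinks only with the square root of the sample size, so quadrupling the sample merely halves the error. A quick illustration under the conservative worst-case assumption p = 0.5:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# Error falls with sqrt(n): 4x the subjects, half the uncertainty.
for n in (100, 400, 1600):
    print(n, round(margin_of_error(n), 3))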

Evaluating Research Proposals (Readex Research)

Comparing proposals “apples-to-apples” is crucial to establishing which one will best meet your needs. Consider these ideas to help you focus on the details that contribute to a successful survey.

Make sure the proposal responds to your objectives.

The proposal process begins well before you ask any research firm for a quote. It really begins with the discussions you and your team have about objectives. What are your goals? What decisions do you want to make when the project is done and you have the data in hand?

Once you have a solid vision of the survey, it’s time to start talking with potential partners. Throughout your conversations, take note: do the various firms ask specific questions about your objectives, the group of people you’d like to survey, and your ultimate goals? Do they ask about the decisions you wish to make? Details regarding your specific need should always be front and center during these conversations.

Sampling plan.

When reviewing the sampling plan, make sure the proposal mentions sample size, response rate estimates, number of responses, and maximum sampling error. If you’re unsure of the impact these figures have on the quality of your results, ask the researcher. They should be able to explain them in terms you can understand.
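These four figures are linked arithmetically: the expected number of responses is the sample size times the response rate, and the maximum sampling error (at 95% confidence, assuming simple random sampling and worst-case p = 0.5) follows from the number of responses. A sketch with invented numbers:

```python
import math

def sampling_plan(sample_size, response_rate, z=1.96):
    """Expected responses and worst-case 95% sampling error."""
    responses = int(sample_size * response_rate)
    max_error = z * math.sqrt(0.25 / responses)  # p = 0.5 is worst case
    return responses, max_error

# Hypothetical plan: 2,000 invitations, 20% expected response rate.
responses, err = sampling_plan(sample_size=2000, response_rate=0.20)
print(responses, round(err * 100, 1))  # responses, error in percentage points
```

If a proposal's numbers don't hang together this way, that is exactly the kind of question to put to the researcher.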

Questionnaire.

The quantity and types of information sought from respondents will affect cost. Quantity encompasses the number of questions and the number of variables to process. Type refers to how the questions will be processed, the data entry involved, and whether all or only some of the data will be cleaned.

No evaluation is complete until you know the approximate number and types of questions planned for the survey. The number of open-ended questions should be stated as well, because open-ended questions that capture verbatim responses can affect the response rate and possibly the price of your survey, especially if it is conducted by mail.

In addition, make sure the proposal clearly indicates who will develop the questionnaire content. Also, determine if it includes enough collaboration time to be sufficiently customized to meet your particular needs.

Data collection approach.

For online surveys, pay attention to the data collection series and who is responsible for sending survey invitations. Multiple emails to sample members can encourage response. The invitation process should also be sensitive to data privacy requirements such as those set out in the GDPR. Proposals for mailed surveys should clearly outline the data collection series and each component of the survey kit.

Data processing.

Any proposal you receive should highlight the steps the research company will take to make sure the data is accurate and representative. Depending on the type of survey, checking logic, consistency, and outliers can take a significant amount of time. For surveys that collect a significant amount of numerical data (salary surveys, market studies, budget planning), the proposal should describe a process for identifying inconsistent answers. Finally, some percentage of mailed surveys should be verified for data entry accuracy.
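The checks described can be as simple as rule-based screens. A toy sketch on hypothetical salary-survey rows, flagging one logic inconsistency and one outlier (the field names and thresholds are assumptions, not any firm's actual process):

```python
# Rule-based data cleaning sketch: a logic check (part-time flag vs.
# reported hours) and a crude outlier screen against the median salary.
def flag_problems(rows, salary_field="salary"):
    salaries = sorted(r[salary_field] for r in rows)
    median = salaries[len(salaries) // 2]
    problems = []
    for i, r in enumerate(rows):
        if r["part_time"] and r["hours_per_week"] > 35:
            problems.append((i, "inconsistent: part-time but >35 hours"))
        if r[salary_field] > 10 * median:
            problems.append((i, "outlier: salary far above median"))
    return problems

rows = [
    {"part_time": False, "hours_per_week": 40, "salary": 52000},
    {"part_time": True,  "hours_per_week": 40, "salary": 21000},
    {"part_time": False, "hours_per_week": 38, "salary": 900000},
]
print(flag_problems(rows))
```

Flagged rows would then be reviewed by a person rather than dropped automatically.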

A straightforward analysis of survey data can meet many objectives. In other cases, a multivariate statistical analysis will provide deeper insights toward your objectives and make the results easier to use. If your objectives include learning about separate segments of your circulation, cross-tabulations should be specified.
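A cross-tabulation is simply a count of answers broken out by segment. A minimal standard-library sketch with invented segments and responses:

```python
from collections import Counter

def crosstab(rows, row_key, col_key):
    """Minimal cross-tabulation: counts for each (segment, answer) pair."""
    return Counter((r[row_key], r[col_key]) for r in rows)

# Hypothetical example: satisfaction broken out by subscriber segment.
rows = [
    {"segment": "print",   "satisfied": "yes"},
    {"segment": "print",   "satisfied": "no"},
    {"segment": "digital", "satisfied": "yes"},
    {"segment": "digital", "satisfied": "yes"},
]
table = crosstab(rows, "segment", "satisfied")
print(table[("digital", "yes")])  # count of satisfied digital subscribers
```

Statistical packages build the same table in one call; the point is only that segment-level questions require the proposal to plan for this breakout.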

Deliverables.

A variety of reporting options exist for a survey. These include but are not limited to data tables, a summary of the results, in-depth analysis, and graphed presentations. As a result, you need to understand exactly what you’ll receive following your survey and in what format.

No surprises!

Make sure the proposal covers all the bases: what you need to do and provide, what the firm will do, when they will do it, and how much it will cost. There should be no surprises in what you need to supply: no “you need how much letterhead and envelopes?” a week before your survey is scheduled to mail. Review the price carefully and understand what it does and does not include. As with many things in life, you usually get what you pay for.



Research Proposal: Components, Types, Topics, Importance, and Applications


What is a Research Proposal?

A research proposal is a document that outlines the plan and rationale for conducting a research study. It serves as a blueprint for the entire research process and helps researchers communicate their objectives, methods, and expected outcomes effectively.

Components of a Research Proposal

The key components of a research proposal include:

  • Title (Concise and informative title that reflects the essence of the research study)
  • Abstract (Brief summary of the research proposal, highlighting its key objectives, methods, and expected outcomes)
  • Introduction (Overview of the research topic, highlighting its significance and relevance)
  • Research Objectives/Questions/Hypotheses (Clear and specific statements that outline the purpose of the study)
  • Literature Review (Critical analysis of existing scholarly works related to the research topic)
  • Research Methodology (Detailed explanation of the research design, data collection methods, and analysis techniques)
  • Significance and Expected Outcomes (Explanation of the potential impact of the research and the expected results)
  • Research Timeline (Proposed timeline that outlines the key milestones and activities of the research study)
  • References (Comprehensive list of the sources cited in the research proposal)

Also learn about Action Proposal

Types of Research Proposals

Research proposals can vary depending on the field of study and the intended audience. Different types of research proposals can help you determine which format is most appropriate for your specific needs.

Whether responding to a solicitation, submitting an unsolicited proposal, or seeking continuation or renewal funding, each proposal type requires careful consideration and alignment with the sponsor’s objectives and guidelines.

Here are some common types of research proposals:

1. Solicited Proposals

Solicited proposals are submitted in response to a specific call or request issued by a sponsor. These calls, often referred to as Request for Proposals (RFP) or Request for Quotations (RFQ), outline the sponsor’s specific requirements, objectives, and evaluation criteria.

Solicited proposals must adhere to the provided guidelines and may include technical specifications and terms and conditions set by the sponsor. Broad Agency Announcements (BAAs) are similar but are not considered formal solicitations.

2. Unsolicited Proposals

Unsolicited proposals are submitted to a sponsor without a specific request or solicitation. In these cases, the investigator believes that the sponsor has an interest in the subject matter. Unsolicited proposals require the researcher to present a compelling case for the significance and relevance of their research, convincing the sponsor of the value and potential impact of the proposed study.

3. Preproposals

Preproposals are typically requested by sponsors who want to streamline the application process and minimize the effort required by applicants. Preproposals are in the form of a letter of intent or a brief abstract that outlines the main objectives and approach of the research.

After reviewing the preproposal, the sponsor informs the investigator if a full proposal is warranted. This process allows both the investigator and the sponsor to determine if it is worthwhile to proceed with a complete proposal submission.

4. Continuation or Non-competing Proposals

Continuation or non-competing proposals are submitted for multi-year projects that have already received funding from the sponsor for an initial period, typically one year. These proposals confirm the original proposal’s scope, objectives, and funding requirements for the subsequent period.

The sponsor’s decision to continue funding is contingent upon satisfactory work progress and the availability of funds.

5. Renewal or Competing Proposals

Renewal or competing proposals are submitted when an existing project is nearing its end, and the investigator requests continued support for the research. From the sponsor’s perspective, these proposals are treated similarly to unsolicited proposals, requiring a thorough presentation of the project’s achievements, impact, and future plans.

Renewal proposals must demonstrate the ongoing relevance and value of the research, highlighting the need for further funding to continue the project’s objectives.

6. Grant Proposals

Grant proposals are submitted to funding agencies, such as government bodies, foundations, or organizations, to secure financial support for research projects.

These proposals typically require a detailed description of the research project, including the objectives, methodology, expected outcomes, budget, and timeline. Grant proposals often follow specific guidelines provided by the funding agency.

7. Dissertation Proposals

Dissertation proposals are submitted by doctoral students as part of their research journey. These proposals outline the research topic, objectives, theoretical framework, methodology, and anticipated contributions to the field.

Dissertation proposals also typically include a literature review to establish the context and significance of the proposed research.

8. Project Proposals

Project proposals are common in academic and professional settings where research projects are undertaken. These proposals outline the objectives, scope, methodology, timeline, and expected outcomes of the project.

Project proposals often include details about the project team, resources required, and the potential impact of the project on stakeholders.

9. Thesis Proposals

Similar to dissertation proposals, thesis proposals are submitted by students pursuing a master’s degree. These proposals present the research topic, objectives, methodology, and expected contributions to the field.

Thesis proposals also include a literature review that highlights the existing knowledge and research gaps in the chosen area of study.

10. Research Funding Proposals

Research funding proposals are typically submitted by researchers or research teams within academic institutions or research organizations. These proposals aim to secure funding for ongoing or new research projects.

Research funding proposals often include a detailed description of the research objectives, methodology, expected outcomes, budget, and timeline. They may also require a justification for the need for funding and a demonstration of the potential impact of the research.

11. Feasibility Study Proposals

Feasibility study proposals are used to assess the practicality and viability of a research project before its full implementation. These proposals outline the research objectives, methodology, timeline, and expected outcomes, with a particular focus on evaluating the feasibility of conducting the research.

Feasibility study proposals often involve preliminary data collection or analysis to inform the decision-making process.

12. Program Evaluation Proposals

Program evaluation proposals are designed to assess the effectiveness, efficiency, and impact of a specific program, intervention, or policy. These proposals typically outline the evaluation objectives, methodology, data collection methods, analysis techniques, and expected outcomes.

Program evaluation proposals often require collaboration with relevant stakeholders and may involve both qualitative and quantitative research methods.

Steps in Developing a Research Proposal

The following steps are involved in developing a research proposal:

  • Identify a research topic. Select a topic that aligns with your interests, expertise, and existing gaps in knowledge.
  • Review the existing literature. Conduct a comprehensive literature review to understand the current state of knowledge in your research area and identify research gaps.
  • Formulate research objectives, questions, or hypotheses. Clearly define the objectives, questions, or hypotheses that your study aims to address.
  • Design the research methodology. Determine the most appropriate research design, data collection methods, and analysis techniques for your study.
  • Develop a research timeline. Create a timeline that outlines the key activities and milestones of your project, ensuring a realistic and achievable plan.
  • Consider ethical issues and research limitations. Address any ethical concerns associated with your research, such as participant consent and data privacy, and acknowledge the potential limitations of your study.
  • Write the research proposal. Compile all the components into a cohesive document, ensuring clarity, coherence, and adherence to guidelines.

Selecting Research Proposal Topics

Selecting a suitable research topic is important for the success of your research proposal. Consider the following tips when choosing your research topic:

  • Choose an interesting topic. Select a research area aligned with your passions, experiences, or career aspirations to stay engaged and motivated throughout the process.
  • Narrow down your topic. Refine your research question to a specific aspect or subtopic to maintain focus and avoid overwhelming amounts of information.
  • Familiarize yourself with existing literature to gain insights, identify gaps in knowledge, and refine the scope of your research.
  • Tailor your topic selection to meet the specific requirements and expectations outlined in your research assignment.
  • Consult with professors or TAs for guidance, insights, and recommendations related to potential research topics within your field of study.
  • Discuss your research ideas with classmates or friends to gain different perspectives, identify new angles, and prompt innovative approaches to your topic.
  • Consider the “who, what, when, where, and why” questions.
  • Why did you choose the topic? What aspects of the topic interest you? Do you have a particular opinion or stance on the issues involved?
  • Who are the key information providers on this topic? Are there specific organizations, institutions, or experts affiliated with the topic?
  • What are the major questions, debates, or issues surrounding the topic? Are there different viewpoints or perspectives to consider?
  • Where is your topic significant? Does it have local, national, or international implications? Are there specific geographical regions or communities affected by the topic?
  • When is or was your topic important? Is it a current event or a historical issue? Are you interested in comparing your topic across different time periods?

Examples of Research Proposal Topics

  • The impact of social media on mental health among adolescents
  • Exploring the effectiveness of mindfulness-based interventions in reducing stress and anxiety
  • Investigating the factors influencing consumer buying behavior in the e-commerce industry
  • Assessing the effects of climate change on agricultural productivity in developing countries
  • Investigating the Relationship between Exercise and Cognitive Function in Older Adults: A Randomized Controlled Trial.
  • Exploring the Role of Artificial Intelligence in Enhancing Customer Experience in the Retail Industry.
  • Understanding the Factors Influencing Employee Job Satisfaction and Engagement in the Workplace.
  • Examining the Impact of Online Learning on Student Performance and Satisfaction in Higher Education.
  • Investigating the Relationship between Parental Involvement and Academic Achievement among Elementary School Students.
  • Analyzing the Effects of Early Childhood Education Programs on Long-term Academic Success and Socioeconomic Outcomes.
  • Exploring the Factors Influencing Consumer Decision-making in Purchasing Organic Food Products.
  • Investigating the Effects of Workplace Diversity on Organizational Performance and Innovation.

Importance and Impact of a Well-Written Research Proposal

Crafting a well-structured and compelling research proposal is essential for several reasons:

  • A well-developed research proposal increases your chances of securing funding from organizations and institutions.
  • Research proposals are often required for academic programs and can contribute to your academic and professional growth.
  • A well-crafted research proposal provides a clear direction and plan for your research study, minimizing ambiguity and ensuring focused efforts.
  • A thorough research proposal increases the likelihood of conducting impactful research that contributes to knowledge and addresses real-world problems.

Applications of Research Proposals

Research proposals serve as essential tools for planning and initiating research projects across various fields. They play a crucial role in academic, scientific, and professional settings. Here are some key applications of research proposals:

  • In academic research, proposals are commonly used to secure grants and scholarships or to gain approval for research projects.
  • In scientific research, investigators in various fields use proposals to obtain funding, collaboration, and ethical clearance.
  • In business and industry, research proposals are essential for market research, product development, and process improvement initiatives.
  • Research proposals enable non-profit organizations to gather data and evidence to support their mission and programs.
  • Research proposals help government agencies gather information for policy development, program evaluation, and decision-making.
  • Research proposals are used in healthcare settings to conduct clinical trials, study disease patterns, and evaluate treatment interventions.
  • Research proposals assist environmental organizations in studying and addressing issues such as climate change, pollution, and conservation.
  • Research proposals are employed in educational settings to study teaching methodologies, curriculum development, and student outcomes.
  • Research proposals are utilized in disciplines such as psychology, sociology, and anthropology to investigate human behavior, social phenomena, and cultural practices.
  • In technology and innovation, research proposals support advancement by exploring new technologies, improving existing systems, and solving technological challenges.


Performance and Evaluation Office (PEO) - Program Evaluation

Centers for Disease Control and Prevention. Framework for program evaluation in public health. MMWR 1999;48 (No. RR-11)

This Public Health Reports article highlights the path CDC has taken to foster the use of evaluation. Access this valuable resource to learn more about using evaluation to inform program improvements.

What is program evaluation?

Evaluation: A systematic method for collecting, analyzing, and using data to examine the effectiveness and efficiency of programs and, as importantly, to contribute to continuous program improvement.

Program: Any set of related activities undertaken to achieve an intended outcome; any organized public health action. At CDC, program is defined broadly to include policies; interventions; environmental, systems, and media initiatives; and other efforts. It also encompasses preparedness efforts as well as research, capacity, and infrastructure efforts.

At CDC, effective program evaluation is a systematic way to improve and account for public health actions.

Why evaluate?

  • CDC has a deep and long-standing commitment to the use of data for decision making, as well as the responsibility to describe the outcomes achieved with its public health dollars.
  • Strong program evaluation can help us identify our best investments as well as determine how to establish and sustain them as optimal practice.
  • The goal is to increase the use of evaluation data for continuous program improvement Agency-wide.
“We have to have a healthy obsession with impact. To always be asking ourselves: what is the real impact of our work on improving health?” (Dr. Frieden, January 21, 2014)

What's the difference between evaluation, research, and monitoring?

  • Evaluation: Purpose is to determine effectiveness of a specific program or model and understand why a program may or may not be working. Goal is to improve programs.
  • Research: Purpose is theory testing and to produce generalizable knowledge. Goal is to contribute to knowledge base.
  • Monitoring: Purpose is to track implementation progress through periodic data collection. Goal is to provide early indications of progress (or lack thereof).
  • Data collection methods and analyses are often similar between research and evaluation.
  • Monitoring and evaluation (M&E) measure and assess performance to help improve performance and achieve results.
“Research seeks to prove, evaluation seeks to improve.” (Michael Quinn Patton, founder and director of Utilization-Focused Evaluation)


  8. How to Write a Research Proposal

    Research proposal examples. Writing a research proposal can be quite challenging, but a good starting point could be to look at some examples. We've included a few for you below. Example research proposal #1: "A Conceptual Framework for Scheduling Constraint Management".

  9. Writing a Research Proposal

    A research proposal is a roadmap that brings the researcher closer to the objectives, takes the research topic from a purely subjective mind, and manifests an objective plan. It shows us what steps we need to take to reach the objective, what questions we should answer, and how much time we need. It is a framework based on which you can perform ...

  10. What Is A Research Proposal? Examples + Template

    The purpose of the research proposal (its job, so to speak) is to convince your research supervisor, committee or university that your research is suitable (for the requirements of the degree program) and manageable (given the time and resource constraints you will face). The most important word here is "convince" - in other words, your ...

  11. Evaluating research: A multidisciplinary approach to assessing research

    Dimensions of the quality of research practice. Evaluation of the quality of research practice is a truly important issue in most scientific domains and at many levels (European Science Foundation, 2012). Increasingly, we are also seeing these assessment efforts across disciplinary and national boundaries.

  12. How to write a research proposal?

    A proposal needs to show how your work fits into what is already known about the topic and what new paradigm will it add to the literature, while specifying the question that the research will answer, establishing its significance, and the implications of the answer. [ 2] The proposal must be capable of convincing the evaluation committee about ...

  13. Writing an Evaluation Plan

    Writing an Evaluation Plan. An evaluation plan is an integral part of a grant proposal that provides information to improve a project during development and implementation. For small projects, the Office of the Vice President for Research can help you develop a simple evaluation plan. If you are writing a proposal for larger center grant, using ...

  14. Evaluation Research: Definition, Methods and Examples

    The process of evaluation research consisting of data analysis and reporting is a rigorous, systematic process that involves collecting data about organizations, processes, projects, services, and/or resources. Evaluation research enhances knowledge and decision-making, and leads to practical applications. LEARN ABOUT: Action Research.

  15. What Is Evaluation?: Perspectives of How Evaluation Differs (or Not

    Source Definition; Suchman (1968, pp. 2-3) [Evaluation applies] the methods of science to action programs in order to obtain objective and valid measures of what such programs are accomplishing.…Evaluation research asks about the kinds of change desired, the means by which this change is to be brought about, and the signs by which such changes can be recognized.

  16. PDF GUIDE FOR THE RESEARCH PROPOSAL

    The research proposal serves a triple goal, namely explaining why there is a for your need research, detailing why your research is feasible and perspectives your research. As such writing a research proposal is a valuable exercise even if you do not pursue a scientific career. You will have to

  17. PDF Criteria for Evaluating Research Proposals

    its importance to the field of special. education, and its relationship to prior research and literature. 2. o Your evaluation of' the personnel and their ability to assume responsibility in this particular area of inquicyo Secondly, a:re ·th~ facilities adequate for this task? 3. o . The . adequacy . of the research design and evaluation o:f ...

  18. How to Evaluate a Study

    Peer-review, the process by which a study is sent to other researchers in a particular field for their notes and thoughts, is essential in evaluating a study's findings. Since most consumers and members of the media are not well-trained enough to evaluate a study's design and researcher's findings, studies that pass muster with other ...

  19. Evaluating Research Proposals

    Comparing proposals "apples-to-apples" is crucial to establishing which one will best meet your needs. Consider these ideas to help you focus on the details that contribute to a successful survey. Make sure the proposal responds to your objectives. The proposal process begins well before you ask any research firm for quote.

  20. [PDF] The Importance of Research Proposal

    A high quality proposal not only promises success for the project, but also impresses the Thesis Jury about the student's potential as a researcher. A research proposal is intended to convince others that the student has a worthwhile research project and that s/he has the competence and the work-plan to complete it. sydney.edu.au. Save to ...

  21. 6 Benefits of Evaluation

    Evaluation provides the words, the stories, and the statistics to garner more support and broaden and deepen your impact. 6) Produces results you can trust. At the end of the day, you want to know that what you are doing is working. Furthermore, you want to be sure you are making a difference and you need to know how you are changing lives for ...

  22. PDF The Importance of Research Proposal

    II-The importance of research proposal A research proposal is a document of usually ten to fifteen pages that informs others of a proposed piece of research. ... It shows the student's ability to critically evaluate relevant literature information. • 4. It indicates the student's ability to integrate and ...

  23. Research Proposal-Components, Types, Topics, Importance, and Applications

    These proposals outline the research objectives, methodology, timeline, and expected outcomes, with a particular focus on evaluating the feasibility of conducting the research. Feasibility study proposals often involve preliminary data collection or analysis to inform the decision-making process.

  24. Program Evaluation Home

    Evaluation: Purpose is to determine effectiveness of a specific program or model and understand why a program may or may not be working. Goal is to improve programs. Research: Purpose is theory testing and to produce generalizable knowledge. Goal is to contribute to knowledge base. Monitoring: Purpose is to track implementation progress through periodic data collection.