
Why is critical thinking important for Psychology students?

Amy Burrell, Daniel Waldeck, and Rachael Leggett – authors of ‘How to make the most of your psychology degree’ – explain what critical thinking is and why it is an essential skill for all Psychology students.

16 September 2022

So, you're off to university to study Psychology? From day one, in seminars, in lectures, attached to almost every assignment, you will encounter the words 'critical thinking' or 'critical evaluation'. But what does that mean?

Showing your critical thinking is not just a box to be ticked on your assignment marksheet: it is a life skill that helps us to really understand and interpret the world around us. Critical thinking is objective and requires you to analyse and evaluate information to form a sound judgement. It is a cornerstone of evidence-based arguments, and forming an evidence-based argument is essential in Psychology. That is why we, your tutors, as well as your future employers, want you to develop this skill effectively.

However, despite all the time we spend learning about critical thinking, there are several common issues that come up. Let us start with the causation versus correlation problem.

Causation versus correlation problem

As researchers, when we see a relationship between two variables, we inevitably get excited. It is very easy to think A caused B but, often, life is more complex than that. What might look like a direct relationship could in fact be coincidence, or due to another explanation. Take, for example, the finding that cows with names produce more milk. It would be easy to get carried away with a media headline, but when you read the paper you will realise that the mechanism for producing more milk is not having a name; it is what a name represents – i.e. the quality of the human-animal relationship (or, more simply put, treating cows as individuals).

There are also many examples of spurious correlations – the internet is full of interesting oddities, like the correlation between divorce rates and consumption of margarine in Maine, USA, or the claim that a shortage of pirates has led to global warming. Whilst there might be a relationship between two variables, we need to be thoughtful when interpreting our findings. We can't just take things at face value. We need to try and understand why these relationships exist.
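To see how easily chance alone can manufacture a "relationship", here is a short, illustrative simulation in Python (our sketch, not part of the original article; the data are random numbers with no relationship at all): it draws many pairs of small, completely independent samples and counts how often they nonetheless correlate strongly.

```python
import random

random.seed(0)  # illustrative only; any seed shows the same pattern

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# 1,000 pairs of *independent* five-point samples: no real relationship exists.
strong = sum(
    abs(pearson([random.gauss(0, 1) for _ in range(5)],
                [random.gauss(0, 1) for _ in range(5)])) > 0.8
    for _ in range(1000)
)
print(strong)  # close to 100 of the 1,000 unrelated pairs show |r| > 0.8
```

With only five observations per variable, roughly one pair in ten shows a "strong" correlation despite there being no relationship whatsoever – one more reason not to take small-sample correlations at face value.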

Limited critique

Another common issue is that, where there is critique, this is often limited. Most students do provide some critique, but this is often very generic – for example, 'my sample size is small and/or non-representative'. This is a good start, but we need to see more depth in critique, and this means being more specific – for example, explaining why a small or non-representative sample might be a problem and what this means for your interpretation of findings.

Another common problem is students being dismissive of non-significant results. Remember: non-significant is not insignificant! If your research was conducted to a high standard with a large sample size, it could be that you found a genuine lack of a relationship between variables, and that might be really important. Listening to the data is essential.
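One way to see why sample size matters when interpreting a non-significant result is a quick simulation (an illustrative sketch we have added, assuming a simple two-group design with a known population SD of 1; real studies would typically use a t-test): even when a genuine effect exists, a small study usually fails to detect it.

```python
import random

random.seed(1)  # illustrative simulation only

def detected(n, effect=0.5, trials=1000):
    """Fraction of simulated two-group studies (n per group) that reach
    p < .05 on a two-sided z-test, given a true mean difference `effect`."""
    hits = 0
    for _ in range(trials):
        a = [random.gauss(0.0, 1.0) for _ in range(n)]
        b = [random.gauss(effect, 1.0) for _ in range(n)]
        z = (sum(b) / n - sum(a) / n) / (2.0 / n) ** 0.5
        hits += abs(z) > 1.96  # the p < .05 threshold for a z-test
    return hits / trials

print(detected(10))   # a small study usually *misses* this real effect
print(detected(100))  # a larger study detects it the vast majority of the time
```

An underpowered study that returns p > .05 tells you very little; a well-powered study that returns p > .05 is genuinely informative.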

It is important to be aware of wider issues too – for example, the replication crisis. This describes the situation where the findings of many published studies are difficult to reproduce. It is important that findings are replicated to validate them. The replication crisis is a problem for Psychology too – so much so that it presents opportunities for students, i.e. to conduct replication studies!

How do I think critically?

Ok, so we've talked about why critical thinking is important and pointed out some of the challenges, but how do you do critical thinking? We include lots of advice in our book, but there are some key tips to help you get started:

  • Read, read, and read some more – you are a Psychology student now. If you hate reading, you chose the wrong course! Reading helps us to understand how other people critique, and this gives us the opportunity to reflect on whether we agree or disagree (and why).
  • It's pretty obvious, but go to class. Not only that, engage in your learning. Don't be the passive person sat on their phone. Get in there, get your hands dirty, so to speak. Your sneaky tutors will have built skills development, including critical thinking, into their class activities. Make the most of this to help you learn.
  • Remember critique can be positive or negative – for example, the study could be methodologically robust, but the small sample size could mean it lacks statistical power.
  • It is useful to consider what the researcher can control as a starting point for critiquing research papers (e.g., sample size, methods used, materials used etc.). You can also consider things they can't control (e.g., the time limits or budget for their study, that the population they are drawing their sample from might be small or hard to recruit).
  • When it comes to critique, there is no limit. There is no magic number of critical points you need to make to get a pass or reach a particular grade. The stronger the quality of your critique, the better your grade.

And, finally, remember critical thinking is not just repeating someone else's critique. By all means consider the critique of others, but read the papers they are critiquing for yourself and come to your own conclusions.

So, how do we sum up? Well, at the risk of sounding repetitive, there really is only one message we are trying to get across: critical thinking is an essential skill for Psychology students. And for graduates! Once you get into the workplace, you will find you need to be able to think critically to do your day job. Add to that, being able to think critically at a broader, strategic level (e.g., about the replication crisis!) will only ever put you at an advantage, maybe even make you a frontrunner for that promotion. So, if you crack it during your degree, you will be at a massive advantage in your career.

Our book How to make the most of your psychology degree: study skills, employability, and professional development is available to purchase here.

Dr Amy Burrell – formerly Assistant Professor in Forensic Psychology at Coventry University – has considerable experience of tutoring and teaching in Psychology. She is now a Research Fellow in the School of Psychology at the University of Birmingham.

Dr Dan Waldeck is an Assistant Professor in Psychology at Coventry University, with extensive experience of teaching research methods and study skills.

Rachael Leggett is a Lecturer in Forensic Psychology at Coventry University and routinely teaches across forensic topics including study skills and employability.


Definition of Critical Thinking:

Description:

Critical thinking refers to the intellectual process of analyzing, evaluating, and interpreting information and arguments in a systematic and objective manner. It involves the careful examination of facts, evidence, and reasoning to form rational and well-informed judgments.

Components:

Critical thinking includes several essential components:

  • Analysis: The ability to break down complex information into its constituent parts and examine them systematically.
  • Evaluation: The capacity to assess the credibility, accuracy, and reliability of information and arguments.
  • Inference: The skill to draw logical and reasoned conclusions based on available evidence.
  • Interpretation: The aptitude to comprehend and explain the meaning and significance of information and evidence.
  • Explanation: The capability to clarify and justify one’s own thought processes and reasoning, explicitly stating the underlying assumptions and principles.
  • Self-regulation: The discipline to monitor one’s own thinking, recognizing and challenging biases, prejudices, and assumptions.
  • Open-mindedness: The willingness to consider alternative viewpoints, perspectives, and hypotheses without prejudice or preconceived notions.

Importance:

Critical thinking plays a vital role in various aspects of life, including education, personal and professional relationships, problem-solving, decision-making, and understanding complex issues. It enables individuals to think independently, make informed judgments, evaluate the reliability of information, and develop well-reasoned arguments.

Developing and applying critical thinking skills can lead to numerous benefits, such as:

  • Improved problem-solving abilities and decision-making skills.
  • Enhanced communication and argumentation skills.
  • Strengthened comprehension and interpretation of information.
  • Increased objectivity and rationality in thinking.
  • Heightened creativity and innovation.
  • Reduced vulnerability to manipulation and misinformation.
  • Greater self-awareness and personal growth.


Unit 04: How to Think Critically about Psychological Science


Unit 4: Assignment #1 (due before 11:59 pm Central on Monday September 6) :

  • Watch Crash Course’s (2014) YouTube, “Psychological Research.” Because the narrator of the video speaks quite rapidly, you might need to watch the video at least twice (or use the speed-controller in the settings options on YouTube).
  • Read Halonen’s (1996) article, “On Critical Thinking [in Psychology].” Note that in Unit 2, we worked on what Halonen refers to as “Practical” critical thinking skills. In this Unit, we will be working on what Halonen refers to as “Methodological” critical thinking skills.
  • Read the first page of Dewey’s (2007) chapter, “Critical Thinking [in Psychology].”
  • Read Stafford’s (2014) article, “What It Means To Be Critical [about Psychological Research],” which is more about how to be critical of psychological science than why it’s important to be critical, but Stafford’s article will prepare you for other Assignments in this Unit.
  • Write an essay arguing either in favor of or against the statement, “Critical thinking is crucial to understanding psychological science.”
  • You may write either a Reasons/Arguments Essay OR an Examples Essay.
  • Remember to begin by jotting down somewhere your three Reasons/Arguments or your three Examples.
  • Next, you should write your three Reasons/Arguments paragraphs or your three Examples paragraphs.
  • Then, you should write your Thesis Statement.
  • Next, write your Introduction Paragraph, including a hook.
  • The last step is to write your Conclusion Paragraph, in which you restate your Thesis Statement and end with something witty or profound.
  • Remember that all five of your paragraphs (your Introduction Paragraph, each of your three Reasons/Arguments Paragraphs or Examples Paragraphs, and your Conclusion Paragraph) need to have five sentences:
  • a Topic Sentence;
  • three Supporting Sentences; and
  • a Conclusion Sentence.
  • Save your essay as a PDF and name the file YourLastname_CriticalThinkingEssay.pdf.
  • Go to the Unit 4: Assignment #1 Discussion Board and make a new Discussion Board post to which you attach your essay, saved as a PDF. Remember to “Attach” your essay’s PDF (don’t embed your file or use the “File” tool; instead, use the “Attach” tool).

Unit 4: Assignment #2 (due before 11:59 pm Central on Tuesday September 7) :

  • Read about the first (“Sensationalized Headlines”), second (“Misinterpreted Results”), third (“Conflicts of Interest”), and twelfth (“Non-Peer Reviewed Material”) indicators of bad science in Compound Interest’s (2015) infographic, “A Rough Guide to Spotting Bad Science.”
  • To see examples of the first (“Sensationalized Headlines”) and twelfth (“Non-Peer Reviewed Material”) indicators of bad science, watch Above the Noise’s (2017) YouTube, “Top 4 Tips To Spot Bad Science Reporting.”
  • Read Ossola’s (2017) article, “Can You Tell If a Health Story Is Total BS?” Ossola’s indicators of “Check the Label” and “Control the Spin” are like Compound Interest’s “Sensationalized Headlines” indicator; however, Ossola presents a novel indicator, “Beware the Animal Study.”
  • To see examples of these indicators of bad science, watch Last Week Tonight with John Oliver’s (2016) YouTube, “Scientific Studies.” Warning: John Oliver is a late-night comedian/TV host. Therefore, this video contains adult content, adult language, and extreme irreverence toward a wide swath of people. The video presents numerous examples of bad science indicators; however, if you’d prefer not to watch the video, then please don’t.
  • Find three news reports, each of which reports about a different research study that is characterized by one of these indicators of bad science:
  • “Sensationalized Headlines”
  • “Misinterpreted Results”
  • “Non-Peer Reviewed Material”
  • “Beware the Animal Study”
  • “Conflicts of Interest”
  • Be sure to find three different news reports, each of which reports about a different research study , rather than three news reports all of which report about the same research study.
  • Describe each of the three news reports, preferably each in its own paragraph.
  • Identify which indicator of bad science characterizes each news report.
  • Provide for each news report either its URL (using the technique you learned from the Course How To) or, if a video, its YouTube or Vimeo link (using the technique you learned from the Course How To).

Unit 4: Assignment #3 (due before 11:59 pm Central on Wednesday September 8) :

  • Watch TEDxDelft’s (2012) YouTube, “The Danger of Mixing Up Causality and Correlation.”
  • Watch PsychU’s (2015) YouTube, “Correlation vs. Causation – PSY 101.”
  • Because PsychU momentarily confuses the term “hypothesis” with the term “theory,” watch PBS’s (2015) YouTube, “Fact vs. Theory vs. Hypothesis vs. Law … Explained!”
  • Make sure you know the meanings of, and differences among, the four terms: Fact, Theory, Hypothesis, and Law. Not only will you need to know these terms and their differences throughout the rest of this course, but everyone should know these terms and their differences.
  • Watch AsapScience’s (2017) YouTube, “This ≠ That.”
  • Watch Khan Academy’s (2011) YouTube, “Correlation and Causality.”
  • Jot down at least six examples of correlation not proving causation from the videos you watched. One example is the correlation between the amount of ice cream purchased (during each month of the year) and the number of drowning deaths (during each month of the year) not proving that ice cream causes drowning.
  • Make sure you understand that two variables (e.g., ice cream purchases per month and drowning deaths per month) might both be caused by another variable (e.g., season of the year). That other variable is often called a confounding variable.
  • Make sure you understand that the correlation between two variables (e.g., pool drownings per year and Nicholas Cage films per year) might simply be due to coincidence.
  • Make sure you understand that rather than one variable (e.g., skipping breakfast) causing another variable (e.g., obesity), the causation might be reversed.
  • Teach three people (friends, roommates, family members, and the like) why they should not confuse correlation with causation.
  • When you are teaching each person, provide examples of correlations that do not prove causation, using the examples you saw in the videos.
  • To make sure that each of the three people learned why correlation should not be interpreted as causation, ask each person to tell you another example (an example that you did not tell them) of correlation not proving causation.
  • Also teach each of the three persons the difference between hypothesis and theory.
  • Then, make a new Discussion Board post in which you do the following:
  • describe how you taught the three persons that correlation cannot be interpreted as causation;
  • state each of the three persons’ initials (e.g., MG) and their approximate age; and
  • report the examples each person told you of correlations that should not be confused with causation.
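The confounding-variable idea from this assignment (a hidden third variable, such as season, driving both ice-cream purchases and drownings) can be sketched in a few lines of Python. This is an illustrative simulation we have added with made-up numbers, not part of the course materials:

```python
import random

random.seed(2)  # made-up numbers purely for illustration

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A hidden third variable (daily temperature, our stand-in for "season")
# drives both measurements; neither one causes the other.
temperature = [random.gauss(20, 8) for _ in range(365)]
ice_cream_sales = [2.0 * t + random.gauss(0, 5) for t in temperature]
drownings = [0.5 * t + random.gauss(0, 3) for t in temperature]

r = pearson(ice_cream_sales, drownings)
print(round(r, 2))  # a strong positive correlation despite no causal link
```

The correlation between ice-cream sales and drownings comes out strongly positive even though, by construction, neither variable influences the other – exactly the confound the videos describe.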

Unit 4: Assignment #4 (due before 11:59 pm Central on Thursday September 9) :

  • To cement your learning about critical thinking and psychological science, read Tesler’s (2016) article, “Can You Believe It? Seven Questions to Ask About Any Scientific Claim.”
  • Go to the Unit 4: Assignment #2 and #4 Discussion Board and read all the posts made in Unit 4: Assignment #2 by the other members (or member) of your Chat Group.
  • If you are in a Chat Group with two other members, and one of the two other Chat Group members hasn’t yet posted their Unit 4: Assignment #2 — and the due date for Unit 4: Assignment #2 has passed — you can choose two studies from the Chat Group member who has already posted their Unit 4: Assignment #2.
  • If you are in a Chat Group with only one other member , and that other member hasn’t yet posted their Unit 4: Assignment #2 — and the due date for Unit 4: Assignment #2 has passed — you can choose two studies from students who are not in your Chat Group.
  • Now, pretend that instead of your Chat Group member(s) finding and posting these two studies on our class Discussion Board as examples of bad science, two relatives or acquaintances of yours posted the two studies on Facebook, Twitter, or other social media as recommendations to their friends and followers (of good science).
  • You are required to make two separate Discussion Board response posts (replies); each response post should be at least 200 words, and each post should be in response to only one of the two studies.
  • Even if you are replying to the same student, you must make two separate Discussion Board posts.
  • Each of your two response posts must incorporate what you learned from Tesler’s (2016) article.
  • Address your two responses not to your Chat Group Member(s) but instead to fictitious people, such as “Dear Aunt Bessie” or “Hey, Freshman Roommate.”

Unit 4: Assignment #5 (due before 11:59 pm Central on Saturday September 11) :

  • Download (to your own computer) and save The Logic of Science’s (no date) Hierarchy of Scientific Evidence graphic (PDF format; also available in Text-Only Word format), and
  • read Brunning’s (2015) article, “A Rough Guide to Types of Scientific Evidence.”
  • Then, read about some of the most famous psychology case reports in Jarrett’s (2015) article, “Psychology’s Greatest Case Studies – Digested.”
  • Think about why it was important for the various scientists who reported these case studies to report them.
  • Then, read about an example of a randomized controlled study in Fradera’s (2017) article, “How Much Are Readers Misled by Headlines that Imply Correlational Findings Are Causal?”
  • Think about why it was important for these scientists to conduct a randomized controlled study on this topic.
  • Then, to see an example of a journal article reporting a meta-analysis, read the abstract of Hyde and Lynn’s (1988) article, “Gender Differences in Verbal Ability: A Meta-Analysis.”
  • Think about why it was important for these scientists to conduct a meta-analysis on this topic.
  • Write an essay arguing either in favor of or against the statement, “All scientific evidence is equally strong.”
  • Begin by jotting down your three Reasons/Arguments or your three Examples.
  • Then, write your three Reasons/Arguments Paragraphs or your three Examples Paragraphs.
  • For each of your three Reasons/Arguments Paragraphs or each of your three Examples Paragraphs, write a Topic Sentence, three Supporting Sentences, and a Conclusion Sentence.
  • Then, write your essay’s Thesis Statement.
  • Then, write your essay’s Introduction Paragraph, including a hook.
  • Then, write your essay’s Conclusion Paragraph, in which you restate your Thesis Statement and end with something witty or profound.
  • Remember that all five of your paragraphs, your Introduction Paragraph, each of your three Reasons/Arguments Paragraphs or your three Examples Paragraphs, and your Conclusion Paragraph, must have a Topic Sentence, three Supporting Sentences, and a Conclusion Sentence.
  • Save your essay as a PDF and name the file YourLastname_EvidenceEssay.pdf.
  • Make a new Discussion Board post to which you attach your essay, saved as a PDF. This is described in the Course How To (under the topic, “How to Save and Attach a Group Text Chat Transcript”).

Unit 4: Assignment #6 (due before 11:59 pm Central on Monday September 13) :

  • To learn what replication means in research, read CK-12’s (no date) article, “Replication.”
  • To get a quick overview of replication courtesy of the NBC comedy TV show, “The Good Place,” read Harnett’s (2019) tweet.
  • Be sure to notice in the excerpt from Dewey’s article the methodological reasons he gives for why a result might not replicate.
  • Be sure to read the entire cartoon, frame by frame.
  • Although it might seem repetitive, the point is to read closely through all 20 of the frames that report the scientists’ results.
  • Then read the final page’s “Explanation.”
  • Now that you understand what p-hacking means, read Hobbes’s (2019) tweet about the problems with studies that report a link between “a single food” (e.g., eggs) and “a single health outcome” (e.g., heart disease).
  • To ensure that you understand the role of chance and statistical probability in producing a result that might not replicate, read Hanna’s (2012) article, “Why Is Replication So Important?”
  • To understand the difference between the terms replication and reproduction, read the handout, “Replication (and Replicability) versus Reproduction (and Reproducibility).”
  • To understand the role of false positives in research, watch Veritasium’s (2016) YouTube, “Is Most Published Research Wrong?”
  • This YouTube is a more complex video than some of the others you’ve watched in this Unit.
  • Try to understand as much as you can. You’ll have a chance to discuss what you don’t understand in this video with your Chat Group.
  • If your last name comes first alphabetically in your Chat Group, watch NOVA’s (2017) video, “What Makes Science True?”
  • If your last name comes last alphabetically in your Chat Group, read all the way through Naro’s (2016) comic strip, “Why Can’t Anyone Replicate the Scientific Studies from Those Eye-Grabbing Headlines?”
  • If your last name comes neither first nor last in your Chat Group (or if you are in a Chat Group with only two members), read Carroll’s (2017) article, “Science Needs a Solution for the Temptation of Positive Results,” and Leyser et al.’s (2017) article, “The Science ‘Reproducibility Crisis’ – and What Can Be Done about It?”
  • Begin by reviewing the articles and videos that everyone in your Chat Group read and watched.
  • Next, take turns telling the other member(s) of your Chat Group about the article, video, or comic strip that individual Chat Group members read.
  • identify one statistical (probability) reason;
  • identify one researcher motivation; and
  • identify one news media motivation that could lead to the public seeing headlines such as these two conflicting headlines.
  • propose one solution to the statistical (probability) reason you identified;
  • propose one solution to the researcher motivation you identified; and
  • propose one solution to the news media motivation you identified.
  • Nominate one member of your Chat Group (who participated in the Chat) to make a post on the Unit 4: Assignment #6 Discussion Board that summarizes your Group Chat in at least 200 words.
  • Nominate a second member of your Chat Group (who also participated in the Chat) to make a post on the Unit 4: Assignment #6 Discussion Board and attach the Chat transcript, saved as a PDF, to that Discussion Board post.
  • Nominate a third member of your Chat Group (who also participated in the Chat) to make another post on the Unit 4: Assignment #6 Discussion Board that states the name of your Chat Group, the names of the Chat Group members who participated in the Chat, the date of your Chat, and the start and stop time of your Group Chat.
  • If only two persons participated in the Chat, then one of those two persons needs to do two of the above three tasks.
  • Before ending the Group Chat, arrange the date and time for the Group Chat you will need to hold during the next Unit (Unit 5: Assignment #6).
  • All members of the Chat Group must record a typical Unit entry in their own Course Journal for Unit 4.
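The "false positives" idea behind this Unit's replication readings can be illustrated with a small simulation (our sketch in Python, not part of the course materials): run many "studies" of a perfectly fair coin, where the null hypothesis is true every time, and count how many reach p < .05 anyway.

```python
import math
import random

random.seed(3)  # illustrative simulation only

def two_sided_p(k, n=100):
    """Exact two-sided p-value for k heads in n flips of a fair coin."""
    tail = max(k, n - k)
    p = 2 * sum(math.comb(n, i) for i in range(tail, n + 1)) / 2 ** n
    return min(p, 1.0)

# 1,000 "studies" of a fair coin: there is no real effect in any of them.
false_positives = sum(
    two_sided_p(sum(random.random() < 0.5 for _ in range(100))) < 0.05
    for _ in range(1000)
)
print(false_positives)  # a few percent still come out "significant" by chance
```

Multiply that handful of chance "significant" results by journals' preference for positive findings, and you get a literature containing published results that fail to replicate.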

Congratulations, you have finished Unit 4! Onward to Unit 5 !

Open-Access Active-Learning Research Methods Course by Morton Ann Gernsbacher, PhD is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License . The materials have been modified to add various ADA-compliant accessibility features, in some cases including alternative text-only versions. Permissions beyond the scope of this license may be available at http://www.gernsbacherlab.org .

CBE Life Sci Educ, v.17(1), Spring 2018

Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

Jason E. Dowd

† Department of Biology, Duke University, Durham, NC 27708

Robert J. Thompson, Jr.

‡ Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Leslie A. Schiff

§ Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Julie A. Reynolds


This study empirically examines the relationship between students’ critical-thinking skills and scientific reasoning as reflected in undergraduate thesis writing in biology. Writing offers a unique window into studying this relationship, and the findings raise potential implications for instruction.

Developing critical-thinking and scientific reasoning skills are core learning objectives of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students’ development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference , while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students’ writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference ) can actually improve students’ scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether or not they intend to pursue a career in science or engineering. Consistent with the view of learning as construction of understanding and meaning ( National Research Council, 2000 ), the pedagogical practice of writing has been found to be effective not only in fostering the development of students’ conceptual and procedural knowledge ( Gerdeman et al. , 2007 ) and communication skills ( Clase et al. , 2010 ), but also scientific reasoning ( Reynolds et al. , 2012 ) and critical-thinking skills ( Quitadamo and Kurtz, 2007 ).

Critical thinking and scientific reasoning are similar but different constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct ( Holyoak and Morrison, 2005 ), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific ( Beyer et al. , 2013 ). Some researchers, such as Lawson (2010) , present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. There are, however, limited coherent frameworks and empirical evidence regarding either the general or domain-specific interrelationships of scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe ( American Association for the Advancement of Science, 2011 ). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” ( Woodin et al. , 2010 , pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible ( Reynolds et al. , 2012 ) and learning how to “think like” and “write like” disciplinary experts ( Meizlish et al. , 2013 ). As a result, student writing affords the opportunities to both foster and examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

The advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” ( Facione, 1990 , p. 3). Although various other definitions of critical thinking have been proposed, researchers have generally coalesced on this consensus expert view ( Blattner and Frazier, 2002 ; Condon and Kelly-Riley, 2004 ; Bissell and Lemons, 2006 ; Quitadamo and Kurtz, 2007 ) and the corresponding measures of critical-thinking skills ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education ( Quitadamo and Kurtz, 2007 ). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” ( Quitadamo and Kurtz, 2007 , p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction ( Hand and Keys, 1999 ), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components—inquiry, writing, collaboration, and reflection—have been linked to critical thinking ( Stephenson and Sadler-McKnight, 2016 ). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach to enhance critical thinking. Across studies, authors advocate adopting critical thinking as the course framework ( Pukkila, 2004 ) and developing explicit examples of how critical thinking relates to the scientific method ( Miri et al. , 2007 ).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved with generation of hypotheses (i.e., inductive processes involved in the generation of hypotheses and the deductive processes used in the testing of hypotheses), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, conceptual and procedural knowledge and cognitive process dimensions can be general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, for students who had comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the importance of the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students’ scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students’ written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

Table 1. Thesis assessment protocol dimensions (table content not reproduced here)

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students’ critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students’ scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students’ individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for writers at Duke University. Not all students are required to complete a thesis, though it is required for students to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two students did not consent to participate in the study; additionally, five students did not validly complete the CCTST (i.e., attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid CCTST scores and invalid CCTST scores. Therefore, we focus on the 65 students who consented to participate and for whom we have complete and valid data in most of this study. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 students granted access to such data. Therefore, for additional analyses involving academic and background data, we focus on the 52 students who consented. 
We note that the 13 students who participated but declined to share additional data performed slightly lower on the CCTST than the 52 others (perhaps suggesting that they differ by other measures, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students’ performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students’ overall strength in using reasoning to form reflective judgments about what to believe or what to do ( August, 2016 ). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate superior performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
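The published score bands can be expressed as a small lookup; a minimal sketch in Python (the function name is ours, the thresholds are the ones quoted above):

```python
def cctst_band(score: int) -> str:
    """Map a 0-100 CCTST score to the performance bands reported by August (2016)."""
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"
```

For example, `cctst_band(85)` falls in the "strong" band, while `cctst_band(86)` crosses into "superior".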

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but not exhibiting mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On a five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
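The within-1-point agreement statistic reported above (82.4% before discussion) can be computed straightforwardly; a sketch, assuming each rater's scores are given as a list aligned by thesis dimension (the function name and example ratings are ours, for illustration only):

```python
def within_one_agreement(rater_a, rater_b):
    """Fraction of paired ratings on the five-point BioTAP scale
    that differ by at most 1 point."""
    assert len(rater_a) == len(rater_b)
    hits = sum(1 for a, b in zip(rater_a, rater_b) if abs(a - b) <= 1)
    return hits / len(rater_a)
```

For hypothetical ratings `[5, 4, 3]` and `[4, 2, 3]`, two of the three pairs agree within 1 point, giving 2/3.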

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
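The two cumulative BioTAP measures can be sketched as follows. Note the factor loadings shown here are placeholders, not the actual values published by Dowd et al. (2016):

```python
def biotap_partial_sum(scores_q1_to_q5):
    """Partial sum over BioTAP questions 1-5 (the higher-order reasoning
    dimensions); each score is on the 1-5 rubric scale."""
    assert len(scores_q1_to_q5) == 5
    return sum(scores_q1_to_q5)

def biotap_factor_score(scores, loadings):
    """Weighted combination of dimension scores under factor loadings.
    Real loadings would come from Dowd et al. (2016); any numbers passed
    in here are illustrative placeholders."""
    return sum(s * w for s, w in zip(scores, loadings))
```

A student rated 5, 4, 3, 4, 5 on questions 1-5 would have a partial-sum score of 21.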

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson’s correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student’s t tests to compare the mean scores of the two groups on each of the seven dimensions and overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in multiple linear-regression analysis to explore questions about how much observed relationships between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
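The correlation step and the mastery/nonmastery grouping can be sketched in plain Python (function names are ours; an actual analysis would typically use a statistics package):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson's correlation coefficient for two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def split_by_mastery(biotap_scores, cctst_scores):
    """Split CCTST scores into mastery (BioTAP dimension score of 5)
    and nonmastery groups, prior to comparing group means."""
    mastery = [c for b, c in zip(biotap_scores, cctst_scores) if b == 5]
    nonmastery = [c for b, c in zip(biotap_scores, cctst_scores) if b < 5]
    return mastery, nonmastery
```

A Student's t test on the two groups returned by `split_by_mastery` then tests whether mean CCTST performance differs across the mastery threshold.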

Although BioTAP scores represent discrete, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous ( Rhemtulla et al. , 2012 )—and therefore using continuous-variable statistical methods (e.g., Pearson’s correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor score) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single dimensions. When we considered individual CCTST dimensions in relation to comprehensive measures for BioTAP, we accounted for seven comparisons; similarly, when we considered individual dimensions of BioTAP in relation to overall CCTST score, we accounted for five comparisons. When all seven CCTST and five BioTAP dimensions were examined individually and without prior knowledge, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate if there are no specific pre-existing hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
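The Holm-Bonferroni step-down procedure can be sketched as follows: p values are sorted, and the k-th smallest is compared against alpha/(m − k + 1), so with m = 35 comparisons the smallest p value must fall below 0.05/35 ≈ 0.00143 before any relationship is declared significant. This is a generic illustration of Holm (1979), not the authors' own code:

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Return a reject (True) / fail-to-reject (False) decision for each
    p value using the Holm-Bonferroni step-down procedure (Holm, 1979)."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        threshold = alpha / (m - rank)  # alpha/m for the smallest p, then alpha/(m-1), ...
        if p_values[i] <= threshold:
            reject[i] = True
        else:
            break  # once one test fails, all larger p values also fail
    return reject
```

For three p values [0.001, 0.04, 0.03] at alpha = 0.05, only the first survives: 0.001 ≤ 0.05/3, but 0.03 > 0.05/2, so testing stops there.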

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2 ); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

Descriptive statistics of CCTST dimensions a

Dimension        Minimum   Mean   Median   Maximum
Analysis           70      88.6     90       100
Interpretation     74      89.7     87       100
Inference          78      87.9     89       100
Evaluation         63      83.6     84       100
Explanation        61      84.4     87       100
Induction          74      87.4     87        97
Deduction          71      86.4     87        97
Overall            73      86.0     85        97

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson’s correlations between students’ cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al., 2016, and the partial sum of scores on questions 1–5) and students’ overall scores on the CCTST are presented in Table 3. We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking (r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to overall CCTST (r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST (Table 3), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference (r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

Correlations between dimensions of CCTST and dimensions of BioTAP a

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from factor loadings published by Dowd et al. (2016) .

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing scores on each dimension of the CCTST of students who exhibit mastery with those of students who do not exhibit mastery on each dimension of BioTAP are presented in Table 4. Focusing first on the overall CCTST scores, we found that the difference between those who exhibit mastery and those who do not in discussing implications of results (BioTAP question 5) is statistically significant (t = 2.73, p = 0.008, d = 0.71). When we expanded t tests to include all 35 comparisons—and, as above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not (t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores (t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which reveal the strength of the differences for statistically significant relationships, range from 0.71 to 0.88.
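The mastery/nonmastery comparisons above pair a Student's t test with a pooled-SD Cohen's d. A sketch of that combination in Python; the group scores below are hypothetical stand-ins, not study data:

```python
import numpy as np
from scipy import stats

# Hypothetical CCTST-style scores for students who do / do not exhibit
# mastery on a rubric dimension (group sizes and values are illustrative).
mastery = np.array([90., 92., 88., 95., 89., 91., 87., 93.])
nonmastery = np.array([84., 86., 82., 88., 85., 80., 83., 87.])

# Student's t test (equal variances assumed)
t, p = stats.ttest_ind(mastery, nonmastery, equal_var=True)

# Cohen's d: mean difference divided by the pooled standard deviation
n1, n2 = len(mastery), len(nonmastery)
pooled_sd = np.sqrt(((n1 - 1) * mastery.var(ddof=1) +
                     (n2 - 1) * nonmastery.var(ddof=1)) / (n1 + n2 - 2))
d = (mastery.mean() - nonmastery.mean()) / pooled_sd
print(f"t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```

Reporting d alongside t and p, as the text does, separates the size of the difference from its statistical significance.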

The t statistics and effect sizes of differences in dimensions of CCTST across dimensions of BioTAP a

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size. Comparisons that are statistically significant after correcting for multiple comparisons are shown in bold.

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.

Partial sum (questions 1–5) of BioTAP scores (n = 52)

Variable                    Model 1     Model 2     Model 3
CCTST inference             0.536***                0.491**
Grade point average                     0.176       0.092
Independent study courses               −0.087      0.001
Writing-intensive courses               0.131       0.021
Institution                             0.329       0.115
Male                                    0.085       0.041
Underrepresented group                  −0.114      −0.060
Adjusted R²                 0.273       −0.022      0.195

** p < 0.01.

*** p < 0.001.
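The three-model regression strategy (predictor alone, covariates alone, combined) reduces to ordinary least squares. A hedged sketch of the combined model; all values below are simulated for illustration and mirror Table 5's variable names only loosely:

```python
import numpy as np

# Simulated data: rubric scores driven mainly by an inference-like
# predictor, with background covariates that explain little variance.
rng = np.random.default_rng(0)
n = 52
inference = rng.normal(88, 5, n)
gpa = rng.normal(3.5, 0.3, n)
covariates = rng.normal(0, 1, (n, 2))  # e.g., institution, course counts
rubric = 0.5 * (inference - 88) / 5 + rng.normal(0, 1, n)

# Design matrix with an intercept column; fit by ordinary least squares.
X = np.column_stack([np.ones(n), inference, gpa, covariates])
beta, *_ = np.linalg.lstsq(X, rubric, rcond=None)
print("coefficient on the inference predictor:", round(beta[1], 3))
```

Fitting the predictor-only and covariates-only models on the same data and comparing adjusted R² values reproduces the logic of models 1–3.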

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of the undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using the CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines (Kuhn et al., 2008). Moreover, science reasoning in writing, captured in students’ undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals’ critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students’ overall critical-thinking skills (Table 3). The particularly significant roles of inference skills (Table 3) and the discussion of implications of results (BioTAP question 5; Table 4) provide a basis for more specific ideas about how these constructs relate to one another and what educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning (Reynolds et al., 2012) and critical thinking (Quitadamo and Kurtz, 2007; Quitadamo et al., 2008; Stephenson and Sadler-McKnight, 2016). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy, because these prior studies actually differ from the current study; the former considered critical-thinking skills as the primary learning outcome of writing-focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing . Even when we control for other potentially relevant background characteristics, the relationship is strong ( Table 5 ).

In taking the complementary view and focusing on BioTAP, when we compared students who exhibit mastery with those who do not, we found that the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al. , 2016 ). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses Type I error at the expense of exacerbating Type II error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and results of other studies (with different populations) suggest that observed relationships may occur more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So while the implications of this work most directly apply to honors thesis writers, we provisionally suggest that all students could benefit from opportunities to develop these skills.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP ( Reynolds et al. , 2009 ), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses ( Reynolds and Thompson, 2011 ; Dowd et al. , 2016 ), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method ( Miri et al. , 2007 ) in terms of the epistemological appreciation of the functions and coordination of evidence ( Kuhn and Dean, 2004 ; Zeineddin and Abd-El-Khalick, 2010 ; Ding et al. , 2016 ) and disciplinary paradigms of truth and justification ( Moshman, 2015 ).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step in the direction of achieving success with more integrated learning outcomes. Hopefully, it will spur greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

Supplementary Material

Acknowledgments.

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose: Insight Assessment/California Academic Press.
  • Beyer C. H., Taylor E., Gillmore G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell A. N., Lemons P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, (1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner N. H., Frazier C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, (1), 47–64.
  • Clase K. L., Gundlach E., Pelaez N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, (5), 290–295.
  • Condon W., Kelly-Riley D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, (1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding L., Wei X., Liu X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, (5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd J. E., Connolly M. P., Thompson R. J., Jr., Reynolds J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, (1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd J. E., Roy C. P., Thompson R. J., Jr., Reynolds J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, (1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd J. E., Thompson R. J., Jr., Reynolds J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 36–51.
  • Facione P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman R. D., Russell A. A., Worden K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, (5), 46–52.
  • Greenhoot A. F., Semb G., Colombo J., Schreiber T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, (2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, (1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen J. S., Bosack T., Clay S., McCarthy M., Dunn D. S., Hill G. W., Whitlock K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, (3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand B., Keys C. W. (1999). Inquiry investigation. Science Teacher, (4), 27–29.
  • Holm S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, (2), 65–70.
  • Holyoak K. J., Morrison R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly G. J., Takao A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, (3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn D., Dean D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, (2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn D., Iordanou K., Pease M., Wirkala C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, (4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, (2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish D., LaVaque-Manty D., Silver N., Kaplan M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri B., David B.-C., Uri Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, (4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (expanded ed.). Washington, DC: National Academies Press.
  • Pukkila P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, (1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo I. J., Faiola C. L., Johnson J. E., Kurtz M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, (3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo I. J., Kurtz M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, (2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds J. A., Smith R., Moskovitz C., Sayle A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, (10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds J. A., Thaiss C., Katkin W., Thompson R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, (1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds J. A., Thompson R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, (2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla M., Brosseau-Liard P. E., Savalei V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, (3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson N. S., Sadler-McKnight N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, (1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq V. N., Stefani L. A. J., Butcher A. C., Heylings D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, (3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman B. E. C., Strickland D. C., Johnson R. L., Payne J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, (5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping K. J., Smith E. F., Swanson I., Elliot A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, (2), 149–169. https://doi.org/10.1080/713611428
  • Willison J., O’Regan K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, (4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin T., Carter V. C., Fletcher L. (2010). Vision and Change in Biology Undergraduate Education: A Call for Action—Initial responses. CBE—Life Sciences Education, (2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin A., Abd-El-Khalick F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, (9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman C. (2000). The development of scientific reasoning skills. Developmental Review, (1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, (2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001

critical thinking is crucial to understanding psychological science

The Open University

Critically exploring psychology


2.1 What is critical thinking?

Critical thinking is a form of making a judgement; it is not about being negative. It is something that most people do daily, often with little awareness of the process they are going through. In simple terms, an example of everyday critical thinking is: ‘I’m going hiking today – should I wear trainers or flip flops?’ Critical thinking involves making an assessment of something, then providing a critique of that position and putting forward new positions. For example, flip flops may be comfortable for the first part of the hike, in hot weather. However, the top of the mountain is rocky, so a more substantial trainer might be needed to reach the summit and protect your toes.

A pair of flip flops and a pair of trainers

There are different stages to critical thinking, but they follow broadly similar steps. Firstly, you need to understand the issue at hand: what is the problem being faced or needing to be solved, and why? Secondly, it is necessary to carry out some form of analysis or collect some evidence about possible ways to understand the issue. For example: when do I need to solve the problem by? What resources do I have available to solve it? What happens if I use method A or method B to solve it? Is there a method C that would solve it more effectively? Thirdly, on the basis of the analysis, an evaluation is carried out, and finally a judgement is made about which way to progress. The advantage of working through these steps is that it widens thinking about a situation or issue, and opens up opportunities for different possible outcomes and solutions.

Flow chart showing and explaining the four stages of critical thinking: understand, analyse, evaluate and judge

The four stages of critical thinking

  • Understand: what is the problem that needs to be solved, and why?
  • Analyse: when do I need to solve the problem by? What resources do I have to solve it? What happens if I use method A or method B to solve it? Is there a method C that would solve it more effectively?
  • Evaluate: based on your analysis, weigh up the possible options.
  • Judge: based on your analysis and evaluation, how will you proceed?

Elder and Paul (2012) describe a ‘well cultivated critical thinker’ as someone who:

  • raises vital questions and problems, formulating them clearly and precisely
  • gathers and assesses relevant information, using abstract ideas to interpret it effectively
  • comes to well-reasoned conclusions and solutions, testing them against relevant criteria and standards
  • thinks open-mindedly within alternative systems of thought, recognising and assessing, as need be, their assumptions, implications, and practical consequences; and
  • communicates effectively with others in figuring out solutions to complex problems.

Why is critical thinking important to psychology and research methods?

Critical thinking enables the researcher to go through the process of recognising their assumptions, challenging them and looking at possible other ways to do something.

In applying critical thinking to research, you will understand that there are different types of research questions; and that these different types of questions require different types of research designs (and consequently different methods) to answer them. If the question and the design do not correspond, then the conclusions that are made about the research are likely to be questionable at best, and probably wrong.

Now that you have a better understanding of what critical thinking is, you will move on to look at a framework for developing research questions.


APS

Teaching: On the Benefits of Critical Ignoring

Aimed at integrating cutting-edge psychological science into the classroom, Teaching Current Directions in Psychological Science offers advice and how-to guidance about teaching a particular area of research or topic in psychological science that has been the focus of an article in the APS journal Current Directions in Psychological Science.

Also see Teaching: The Unexpected Pleasure of Doing Good.

Kozyreva, A., Wineburg, S., Lewandowsky, S., & Hertwig, R. (2022). Critical ignoring as a core competence for digital citizens. Current Directions in Psychological Science, 0(0).

People study psychology to understand themselves, others, and their global community. To foster such understanding, psychological scientists teach critical thinking. Using the scientific method, students learn how to approach competing ideas with an analytical mindset—leading them to rely on evidence rather than anecdote or intuition.  

According to Anastasia Kozyreva, Sam Wineburg, and APS Fellows Stephan Lewandowsky and Ralph Hertwig (2022), psychological scientists should also teach critical ignoring, defined as “choosing what to ignore and where to invest one’s limited attentional capacities.” The proliferation of digital information has given people unprecedented access to content, yet there are few checks and balances to separate false and misleading information from the truth. Thus, people must become smart ignorers of information (Hertwig & Engel, 2016).

To engage in critical ignoring, Kozyreva and colleagues encourage people to use three approaches:  

  • Self-nudging: Redesign your environment to limit the temptation to consume unvetted information (Reijula & Hertwig, 2022). For example, people can use apps or web browser extensions that restrict their use of social media.   
  • Lateral reading: Verify claims before investing attention in them. Start by checking information at its original source, then seek independent verification from third parties, especially those that may be inclined to disagree. Professional fact-checkers commonly use lateral reading (Wineburg & McGrew, 2019). Don’t spend your limited attentional resources on people or organizations that refuse to let their claims be cross-checked.
  • “Do Not Feed the Trolls”: Avoid trolls—people who seek to deceive or harm others online. Trolls score highly on sadism (example item: “I enjoy hurting people”), psychopathy (“Payback needs to be quick and nasty”), and Machiavellianism (“It’s not wise to tell your secrets”). Trolls also find it reinforcing to annoy and upset others (Craker & March, 2016). Any time you catch a whiff of someone engaging in troll-like behavior, block or ignore them.

Ask students to select two topics from the following list that they feel comfortable thinking about:  

  • Student debt crisis  
  • Cancel culture 
  • Critical race theory  
  • Gun control 
  • Mandatory Covid-19 vaccines 
  • Abortion 
  • White supremacy 
  • Animal rights 
  • Climate change 
  • Immigration reform 

Tell students about the importance of critical ignoring, which means ignoring some information and investing their limited attentional resources elsewhere. Next, ask students to consider how they can use the three strategies of critical ignoring to better understand the topics they chose:

  • Self-nudging: How can students redesign their environment to limit the temptation to consume unvetted information related to their chosen topics? How could they limit the time they spend on certain websites? Should they avoid some websites? Why? What other actions can they take to make these environmental changes?   
  • Lateral reading: Ask students to cross-check information from a source (author, organization) and determine whether it conforms to information elsewhere (e.g., Wikipedia, Our World in Data). Students can start by verifying information from its source and then seek independent verification from other sources. Encourage students to try to verify information from independent sources that are either neutral or opposed to a particular position. How does lateral reading help students pay attention to the most relevant information? How might their learning experience differ if they decided to ignore people or organizations that refused to have their claims confirmed by independent sources? 
  • “Do Not Feed the Trolls”: How might students avoid online trolls related to their topics? Have they blocked or ignored trolls in the past? Why might ignoring trolls be the safest and most effective strategy?  

Psychological scientists need to teach both critical thinking and critical ignoring. Faced with a deluge of digital information, people need guidance on deciding what information to ignore and where to invest their limited attentional resources. By self-nudging, lateral reading, and starving online trolls, people can better understand themselves, their fellows, and their global community.  

References  

Craker, N., & March, E. (2016). The dark side of Facebook®: The Dark Tetrad, negative social potency, and trolling behaviors. Personality and Individual Differences, 102, 79–84. https://doi.org/10.1016/j.paid.2016.06.043

Hertwig, R., & Engel, C. (2016). Homo ignorans: Deliberately choosing not to know. Perspectives on Psychological Science, 11(3), 359–372. https://doi.org/10.1177/1745691616635594

Reijula, S., & Hertwig, R. (2022). Self-nudging and the citizen choice architect. Behavioural Public Policy, 6(1), 119–149. https://doi.org/10.1017/bpp.2020.5

Wineburg, S., & McGrew, S. (2019). Lateral reading and the nature of expertise: Reading less and learning more when evaluating digital information. Teachers College Record, 121(11), 1–40. https://www.tcrecord.org/Content.asp?ContentId=22806

About the Authors

APS Fellow C. Nathan DeWall is a professor of psychology at the University of Kentucky. His research interests include social acceptance and rejection, self-control, and aggression. DeWall can be contacted at [email protected].

Dan Bates, LMHC, LPC, NCC

Thinking Philosophically Can Benefit Mental Health

Reflecting on the deeper side of life bolsters mental health.

Updated August 6, 2024 | Reviewed by Michelle Quirk

  • Philosophy and mental health intersect through practical wisdom and reflection.
  • Wisdom aids emotional regulation, empathy, and resilience in life's challenges.
  • Counseling applies philosophical concepts like Stoicism for stress management.

The Intersection of Philosophy and Mental Health

At first glance, philosophy and mental health might seem like unrelated fields. Philosophy often examines abstract, theoretical, and logical questions, constructing arguments that seem detached from the day-to-day concerns of most people, while the field of mental health and counseling deals with subjectivity, the nitty-gritty of everyday life. In particular, counseling is concerned with the implementation of evidence-based treatments that have been customized to the actual needs of an individual. In that way, counseling is intensely pragmatic and practical. Yet the two fields are more intertwined than one might initially think. What I'm proposing is that philosophy can actually benefit your mental health and promote well-being.

From my perspective as a counselor, many of the insights that foster mental well-being in clients are akin to pieces of wisdom. The term "philosophy" itself originates from the Greek words for "love of wisdom." Wisdom serves as a bridge between philosophical reflection and mental health, providing practical guidance that can be transformative.

The Role of Wisdom in Mental Health

Wisdom is not merely an abstract concept but a practical, experience-based understanding of life. It involves a deep comprehension of life's complexities, an ability to manage and regulate emotions, and a capacity for empathy and perspective-taking. Recent research suggests that wisdom is closely linked to mental health, as it helps individuals navigate life's challenges more effectively (Jeste & Lee, 2019).

Jeste and Lee (2019) go on to argue that philosophical wisdom—derived from practical, lived experiences—can play a crucial role in enhancing mental health. This wisdom, which involves understanding life's complexities, managing emotions, and empathizing with others, helps individuals navigate challenges and find meaning. They highlight the use of philosophical ideas in therapeutic practices, such as existential and Stoic philosophy, to aid clients in understanding and addressing their mental health concerns. Reflective practices and narrative therapy are also discussed as methods that incorporate philosophical concepts to promote personal growth and emotional well-being.

This is my personal perspective, but I see counseling as a very practical form of philosophy. One branch of philosophy is concerned with “the good life.” Well, so is counseling, but with an extra serving of subjectivity. Counseling is concerned with “what is the good life” for this person, in their culture, in their context and situation, with their life goals in mind.

According to Jeste and Lee (2019), wisdom is defined as a complex human trait characterized by a deep understanding of life, including knowledge of what is important, emotional regulation, empathy, compassion, reflection, decisiveness, and tolerance for divergent values. This definition emphasizes that wisdom involves both cognitive and emotional elements, making it a holistic quality that enables individuals to deal with life's challenges effectively. Wisdom also encompasses practical application, allowing for better decision-making and enhanced well-being.

Sense-Making

A practical, real-world form of philosophy emerges from the experiences of life, particularly from suffering and adversity. This type of philosophy isn't about detached contemplation but about deriving meaningful insights through lived experiences. It aligns with the concept of "grounded theory" in psychology, which posits that knowledge and understanding arise from the reality of individuals' lived experiences (Charmaz, 2014).

Much of what counseling does, when working with someone who has experienced trauma and suffering, is helping them make sense of their experience. Sense-making is a crucial process for individuals recovering from hardships, as it involves reflecting on and interpreting challenging experiences to integrate them into one's personal narrative. This process helps create a coherent story that connects past events with current identity, fostering a greater sense of purpose and self-awareness. Positive reframing, such as focusing on gratitude, allows individuals to acknowledge the growth and resilience gained from their struggles, rather than merely dwelling on the difficulties. This approach enhances resilience, providing coping mechanisms like positive self-talk, social support, and self-care, which better equip individuals to handle future challenges. Ultimately, sense-making transforms adversity into opportunities for growth and improved well-being.

Practical Philosophy and Counseling

In counseling, philosophical ideas often manifest as guiding principles or frameworks that help clients make sense of their experiences. For instance, existential philosophy, with its focus on meaning, choice, and responsibility, can offer valuable perspectives in therapy. Clients grappling with existential concerns—such as the search for meaning or the fear of mortality—can find solace and clarity in existentialist thought (Yalom, 1980).

Similarly, Stoic philosophy, with its emphasis on resilience and control, provides practical tools for managing emotions and stress. The Stoic idea of focusing on what is within one's control and accepting what is not can be particularly empowering for individuals facing anxiety and uncertainty. This approach aligns with cognitive-behavioral therapy (CBT) techniques that encourage cognitive restructuring to manage negative thoughts (Ellis, 2004).

Reflecting on Experience

The process of reflecting on one's experiences, a key aspect of practical philosophy, is crucial for mental health. Reflective practices enable individuals to gain insight into their thoughts, feelings, and behaviors, facilitating personal growth and emotional regulation. This reflective capacity is linked to mindfulness practices, which have been shown to reduce symptoms of anxiety and depression (Kabat-Zinn, 2003).

Furthermore, narrative therapy, which involves exploring and re-authoring personal stories, draws on philosophical ideas about identity and meaning. By examining the narratives they live by, clients can reshape their understanding of themselves and their circumstances, leading to positive changes in their mental health (White & Epston, 1990).

In summary, philosophy and mental health are deeply interconnected. The practical wisdom derived from philosophical reflection offers valuable insights and tools for navigating life's challenges. By integrating philosophical concepts into counseling, individuals can gain a deeper understanding of themselves and their experiences, fostering greater mental well-being. As research continues to explore the intersections of these fields, the benefits of philosophy for mental health are becoming increasingly evident.

Charmaz, K. (2014). Constructing Grounded Theory. SAGE Publications.

Ellis, A. (2004). Rational Emotive Behavior Therapy: It Works for Me—It Can Work for You. Prometheus Books.

Jeste, D. V., & Lee, E. E. (2019). The emerging empirical science of wisdom: Definition, measurement, neurobiology, longevity, and interventions. Harvard Review of Psychiatry, 27(3), 127–140.

Kabat-Zinn, J. (2003). Mindfulness-based interventions in context: Past, present, and future. Clinical Psychology: Science and Practice, 10(2), 144–156.

White, M., & Epston, D. (1990). Narrative Means to Therapeutic Ends. Norton & Company.

Yalom, I. D. (1980). Existential Psychotherapy. Basic Books.

Dan Bates, Ph.D., is a clinical mental health counselor licensed in the state of Washington and certified nationally.

