Critical Thinking: 40 Useful Performance Feedback Phrases

Critical Thinking: Use these sample phrases to craft meaningful performance evaluations, drive change and motivate your workforce.

Critical Thinking is the ability to think clearly and rationally, understanding the logical connections between ideas through reflective and independent thought.

Critical Thinking: Exceeds Expectations Phrases

  • Demonstrates strong assertiveness and decisiveness when finding solutions to problems.
  • Knows how to communicate ideas and issues easily in a very clear and concise manner.
  • Able to piece together elements and come up with proper deductions on issues.
  • Knows how to clarify problems and solutions easily so that other people can understand.
  • Evaluates ideas and opinions in an unbiased manner without favoritism.
  • Thinks through issues in a very logical manner that results in finding the best solution to a problem.
  • Demonstrates excellent problem-solving skills by assessing a problem and devising the best possible solution for it.
  • Takes into consideration different and varied perspectives when solving problems.
  • Examines the implications and possible consequences to any particular action carefully.
  • Solves problems one by one so as not to mix up issues and ideas.

Critical Thinking: Meets Expectations Phrases

  • Approaches problems strategically and skillfully when it comes to solving issues.
  • Demonstrates assertiveness and decisiveness when it comes to handling problems.
  • Always tries to consider all factors at play before deciding on a particular method or approach.
  • Gathers all the required facts and figures before starting to solve a particular problem.
  • Always seeks to ask questions as a form of finding a sound basis to solving a problem.
  • Not afraid to make mistakes and tries to find creative ways to handle any issue.
  • Looks at issues from different angles rather than in a one-sided way.
  • Shows great flexibility when it comes to changing strategies and tactics while dealing with different problems.
  • Questions regularly to find out if the decision taken will achieve the desired effect.
  • Always feels comfortable and confident when seeking help or advice whenever stuck on solving any issue.

Critical Thinking: Needs Improvement Phrases

  • Does not take the time to carefully consider all available factors before making a decision.
  • Looks at issues in a one-sided manner instead of considering them from different angles.
  • Finds it challenging to arrive at a concrete conclusion after undertaking any evaluation.
  • Not willing to do proper research and relies on outdated data and information.
  • Does not demonstrate curiosity or an attitude of trying to find out more about issues.
  • Not willing to accept corrections and take calculated risks when necessary.
  • Does not show enough willingness to try to improve their critical thinking skills.
  • Does not present ideas and points in a logical order or outline.
  • Looks at issues in a biased and unfair way by not evaluating all factors.
  • Unwilling and inflexible when it comes to changing tactics or strategy when the initial plan fails to achieve the desired effect.

Critical Thinking: Self Evaluation Questions

  • How well do you research and gather facts and information before solving an issue?
  • Describe an instance when you hurriedly made a decision without thinking it through. What was the result?
  • Describe an occasion when you made the best possible decision. How did you feel about it?
  • How well do you consider all the factors available before making your decision?
  • Do you appreciate and learn from your mistakes, and how do you deal with making wrong decisions?
  • Do you always try to inquire further into what missing elements could achieve a positive outcome?
  • How well do you ensure you reach a concrete conclusion when presenting your ideas?
  • When expressing your ideas do you deliver them in a logical manner?
  • Do you usually look at issues in a one-sided manner or at different angles?
  • How flexible are you when it comes to trying different ways of solving problems other than the intended way?

Critical Thinking Definition, Skills, and Examples

Critical thinking refers to the ability to analyze information objectively and make a reasoned judgment. It involves the evaluation of sources, such as data, facts, observable phenomena, and research findings.

Good critical thinkers can draw reasonable conclusions from a set of information, and discriminate between useful and less useful details to solve problems or make decisions. Employers prioritize the ability to think critically—find out why, plus see how you can demonstrate that you have this ability throughout the job application process. 

Why Do Employers Value Critical Thinking Skills?

Employers want job candidates who can evaluate a situation using logical thought and offer the best solution. Someone with critical thinking skills can be trusted to make decisions independently and will not need constant handholding.

Hiring a critical thinker means that micromanaging won't be required. Critical thinking abilities are among the most sought-after skills in almost every industry and workplace. You can demonstrate critical thinking by using related keywords in your resume and cover letter, and during your interview.

Examples of Critical Thinking

The circumstances that demand critical thinking vary from industry to industry. Some examples include:

  • A triage nurse analyzes the cases at hand and decides the order in which the patients should be treated.
  • A plumber evaluates the materials that would best suit a particular job.
  • An attorney reviews evidence and devises a strategy to win a case or to decide whether to settle out of court.
  • A manager analyzes customer feedback forms and uses this information to develop a customer service training session for employees.

Promote Your Skills in Your Job Search

If critical thinking is a key phrase in the job listings you are applying for, be sure to emphasize your critical thinking skills throughout your job search.

Add Keywords to Your Resume

You can use critical thinking keywords (analytical, problem solving, creativity, etc.) in your resume. When describing your work history, include top critical thinking skills that accurately describe you. You can also include them in your resume summary, if you have one.

For example, your summary might read, “Marketing Associate with five years of experience in project management. Skilled in conducting thorough market research and competitor analysis to assess market trends and client needs, and to develop appropriate acquisition tactics.”

Mention Skills in Your Cover Letter

Include these critical thinking skills in your cover letter. In the body of your letter, mention one or two of these skills, and give specific examples of times when you have demonstrated them at work. Think about times when you had to analyze or evaluate materials to solve a problem.

Show the Interviewer Your Skills

You can use these skill words in an interview. Discuss a time when you were faced with a particular problem or challenge at work and explain how you applied critical thinking to solve it.

Some interviewers will give you a hypothetical scenario or problem, and ask you to use critical thinking skills to solve it. In this case, explain your thought process thoroughly to the interviewer. He or she is typically more focused on how you arrive at your solution rather than the solution itself. The interviewer wants to see you analyze and evaluate (key parts of critical thinking) the given scenario or problem.

Of course, each job will require different skills and experiences, so make sure you read the job description carefully and focus on the skills listed by the employer.

Top Critical Thinking Skills

Keep these in-demand critical thinking skills in mind as you update your resume and write your cover letter. As you've seen, you can also emphasize them at other points throughout the application process, such as your interview. 

Analytical Skills

Part of critical thinking is the ability to carefully examine something, whether it is a problem, a set of data, or a text. People with analytical skills can examine information, understand what it means, and properly explain the implications of that information to others.

  • Asking Thoughtful Questions
  • Data Analysis
  • Interpretation
  • Questioning Evidence
  • Recognizing Patterns


Communication

Often, you will need to share your conclusions with your employers or with a group of colleagues. You need to be able to communicate with others to share your ideas effectively. You might also need to engage in critical thinking in a group. In this case, you will need to work with others and communicate effectively to figure out solutions to complex problems.

  • Active Listening
  • Collaboration
  • Explanation
  • Interpersonal
  • Presentation
  • Verbal Communication
  • Written Communication

Creativity

Critical thinking often involves creativity and innovation. You might need to spot patterns in the information you are looking at or come up with a solution that no one else has thought of before. All of this involves a creative eye that can take a different approach from all other approaches.

  • Flexibility
  • Conceptualization
  • Imagination
  • Drawing Connections
  • Synthesizing


Open-Mindedness

To think critically, you need to be able to put aside any assumptions or judgments and merely analyze the information you receive. You need to be objective, evaluating ideas without bias.

  • Objectivity
  • Observation

Problem Solving

Problem-solving is another critical thinking skill that involves analyzing a problem, generating and implementing a solution, and assessing the success of the plan. Employers don't simply want employees who can think about information critically; they also want employees who can come up with practical solutions.

  • Attention to Detail
  • Clarification
  • Decision Making
  • Groundedness
  • Identifying Patterns

More Critical Thinking Skills

  • Inductive Reasoning
  • Deductive Reasoning
  • Noticing Outliers
  • Adaptability
  • Emotional Intelligence
  • Brainstorming
  • Optimization
  • Restructuring
  • Integration
  • Strategic Planning
  • Project Management
  • Ongoing Improvement
  • Causal Relationships
  • Case Analysis
  • Diagnostics
  • SWOT Analysis
  • Business Intelligence
  • Quantitative Data Management
  • Qualitative Data Management
  • Risk Management
  • Scientific Method
  • Consumer Behavior

Key Takeaways

  • Demonstrate that you have critical thinking skills by adding relevant keywords to your resume.
  • Mention pertinent critical thinking skills in your cover letter, too, and include an example of a time when you demonstrated them at work.
  • Finally, highlight critical thinking skills during your interview. For instance, you might discuss a time when you were faced with a challenge at work and explain how you applied critical thinking skills to solve it.

University of Louisville. "What is Critical Thinking."

American Management Association. "AMA Critical Skills Survey: Workers Need Higher Level Skills to Succeed in the 21st Century."


What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan. Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment.

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources.

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

  • Why is critical thinking important?
  • Critical thinking examples
  • How to think critically
  • Frequently asked questions about critical thinking

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process. The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing, critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.


Critical thinking can help you to identify reliable sources of information that you can cite in your research paper. It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

Example: Good critical thinking in an academic context
You encounter a study reporting positive results for a new treatment. However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them.

Example: Poor critical thinking in an academic context
You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

Example: Good critical thinking in a nonacademic context
You read a positive review of a home alarm system. However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source.

Example: Poor critical thinking in a nonacademic context
You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test, these questions focus on the currency, relevance, authority, accuracy, and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?

Frequently asked questions about critical thinking

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy , it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test, focusing on the currency, relevance, authority, accuracy, and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. We tend to remember information best when it amplifies what we already believe and, relatedly, to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.


Promoting and Assessing Critical Thinking

Critical thinking is a high-priority outcome of higher education – critical thinking skills are crucial for independent thinking and problem solving in both our students’ professional and personal lives. But what does it mean to be a critical thinker, and how do we promote and assess it in our students? Critical thinking can be defined as being able to examine an issue by breaking it down and evaluating it in a conscious manner, while providing arguments and evidence to support the evaluation. Below are some suggestions for promoting and assessing critical thinking in our students.

Thinking through inquiry

Asking questions and using the answers to understand the world around us is what drives critical thinking. In inquiry-based instruction, the teacher asks students leading questions to draw from them information, inferences, and predictions about a topic. Below are some example generic question stems that can serve as prompts to aid in generating critical thinking questions. Consider providing prompts such as these to students to facilitate their ability to also ask these questions of themselves and others. If we want students to generate good questions on their own, we need to teach them how to do so by providing them with the structure and guidance of example questions, whether in written form, or by our use of questions in the classroom.

Generic question stems

  • What are the strengths and weaknesses of …?
  • What is the difference between … and …?
  • Explain why/how …?
  • What would happen if …?
  • What is the nature of …?
  • Why is … happening?
  • What is a new example of …?
  • How could … be used to …?
  • What are the implications of …?
  • What is … analogous to?
  • What do we already know about …?
  • How does … affect …?
  • How does … tie in with what we have learned before?
  • What does … mean?
  • Why is … important?
  • How are … and … similar/different?
  • How does … apply to everyday life?
  • What is a counterargument for …?
  • What is the best … and why?
  • What is a solution to the problem of …?
  • Compare … and … with regard to …?
  • What do you think causes …? Why?
  • Do you agree or disagree with this statement? What evidence is there to support your answer?
  • What is another way to look at …?

Critical thinking through writing

Another essential ingredient in critical thinking instruction is the use of writing. Writing converts students from passive to active learners and requires them to identify issues and formulate hypotheses and arguments. The act of writing requires students to focus and clarify their thoughts before putting them down on paper, hence taking them through the critical thinking process. Writing requires that students make important critical choices and ask themselves (Gocsik, 2002):

  • What information is most important?
  • What might be left out?
  • What is it that I think about this subject?
  • How did I arrive at what I think?
  • What are my assumptions? Are they valid?
  • How can I work with facts, observations, and so on, in order to convince others of what I think?
  • What do I not yet understand?

Consider providing the above questions to students so that they can evaluate their own writing as well. Some suggestions for critical thinking writing activities include:

  • Give students raw data and ask them to write an argument or analysis based on the data.
  • Have students explore and write about unfamiliar points of view or “what if” situations.
  • Think of a controversy in your field, and have the students write a dialogue between characters with different points of view.
  • Select important articles in your field and ask the students to write summaries or abstracts of them. Alternately, you could ask students to write an abstract of your lecture.
  • Develop a scenario that places students in realistic situations relevant to your discipline, where they must reach a decision to resolve a conflict.

See the Centre for Teaching Excellence (CTE) teaching tip “ Low-Stakes Writing Assignments ” for critical thinking writing assignments.

Critical thinking through group collaboration

Opportunities for group collaboration could include discussions, case studies, task-related group work, peer review, or debates. Group collaboration is effective for promoting critical thought because:

  • An effective team has the potential to produce better results than any individual.
  • Students are exposed to different perspectives while clarifying their own ideas.
  • Collaborating on a project or studying with a group for an exam generally stimulates interest and increases the understanding and knowledge of the topic.

See the CTE teaching tip “ Group Work in the Classroom: Types of Small Groups ” for suggestions for forming small groups in your classroom.

Assessing critical thinking skills

You can also use the students’ responses from the activities that promote critical thinking to assess whether they are, indeed, reaching your critical thinking goals. It is important to establish clear criteria for evaluating critical thinking. Even though many of us may be able to identify critical thinking when we see it, explicitly stated criteria help both students and teachers know the goal toward which they are working. An effective criterion measures which skills are present, to what extent, and which skills require further development. The following are characteristics of work that may demonstrate effective critical thinking:

  • Accurately and thoroughly interprets evidence, statements, graphics, questions, literary elements, etc.
  • Asks relevant questions.
  • Analyses and evaluates key information and alternative points of view clearly and precisely.
  • Fair-mindedly examines beliefs, assumptions, and opinions and weighs them against facts.
  • Draws insightful, reasonable conclusions.
  • Justifies inferences and opinions.
  • Thoughtfully addresses and evaluates major alternative points of view.
  • Thoroughly explains assumptions and reasons.

It is also important to note that assessment is a tool that can be used throughout a course, not just at the end. It is more useful to assess students throughout a course, so you can see if criteria require further clarification and students can test out their understanding of your criteria and receive feedback. Also consider distributing your criteria with your assignments so that students receive guidance about your expectations. This will help them to reflect on their own work and improve the quality of their thinking and writing.

See the CTE teaching tip sheets “ Rubrics ” and “ Responding to Writing Assignments: Managing the Paper Load ” for more information on rubrics.

If you would like support applying these tips to your own teaching, CTE staff members are here to help.  View the  CTE Support  page to find the most relevant staff member to contact. 

  • Gocsik, K. (2002). Teaching critical thinking skills. UTS Newsletter, 11(2), 1–4.
  • Facione, P. A., & Facione, N. C. (1994). Holistic Critical Thinking Scoring Rubric. Millbrae, CA: California Academic Press. (Retrieved September 2003.)
  • King, A. (1995). Inquiring minds really do want to know: Using questioning to teach critical thinking. Teaching of Psychology, 22(1), 13–17.
  • Wade, C., & Tavris, C. (1987). Psychology (1st ed.). New York: Harper. In: Wade, C. (1995). Using writing to develop and assess critical thinking. Teaching of Psychology, 22(1), 24–28.

Original research article

Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation


  • 1 Lynch School of Education and Human Development, Boston College, Chestnut Hill, MA, United States
  • 2 Graduate School of Education, Stanford University, Stanford, CA, United States
  • 3 Department of Business and Economics Education, Johannes Gutenberg University, Mainz, Germany

Enhancing students’ critical thinking (CT) skills is an essential goal of higher education. This article presents a systematic approach to conceptualizing and measuring CT. CT generally comprises the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion. We further posit that CT also involves dealing with dilemmas involving ambiguity or conflicts among principles and contradictory information. We argue that performance assessment provides the most realistic—and most credible—approach to measuring CT. From this conceptualization and construct definition, we describe one possible framework for building performance assessments of CT with attention to extended performance tasks within the assessment system. The framework is a product of an ongoing, collaborative effort, the International Performance Assessment of Learning (iPAL). The framework comprises four main aspects: (1) The storyline describes a carefully curated version of a complex, real-world situation. (2) The challenge frames the task to be accomplished. (3) A portfolio of documents in a range of formats is drawn from multiple sources chosen to have specific characteristics. (4) The scoring rubric comprises a set of scales each linked to a facet of the construct. We discuss a number of use cases, as well as the challenges that arise with the use and valid interpretation of performance assessments. The final section presents elements of the iPAL research program that involve various refinements and extensions of the assessment framework, a number of empirical studies, along with linkages to current work in online reading and information processing.


In their mission statements, most colleges declare that a principal goal is to develop students’ higher-order cognitive skills such as critical thinking (CT) and reasoning (e.g., Shavelson, 2010 ; Hyytinen et al., 2019 ). The importance of CT is echoed by business leaders ( Association of American Colleges and Universities [AACU], 2018 ), as well as by college faculty (for curricular analyses in Germany, see e.g., Zlatkin-Troitschanskaia et al., 2018 ). Indeed, in the 2019 administration of the Faculty Survey of Student Engagement (FSSE), 93% of faculty reported that they “very much” or “quite a bit” structure their courses to support student development with respect to thinking critically and analytically. In a listing of 21st century skills, CT was the most highly ranked among FSSE respondents ( Indiana University, 2019 ). Nevertheless, there is considerable evidence that many college students do not develop these skills to a satisfactory standard ( Arum and Roksa, 2011 ; Shavelson et al., 2019 ; Zlatkin-Troitschanskaia et al., 2019 ). This state of affairs represents a serious challenge to higher education – and to society at large.

In view of the importance of CT, as well as evidence of substantial variation in its development during college, its proper measurement is essential to tracking progress in skill development and to providing useful feedback to both teachers and learners. Feedback can help focus students’ attention on key skill areas in need of improvement, and provide insight to teachers on choices of pedagogical strategies and time allocation. Moreover, comparative studies at the program and institutional level can inform higher education leaders and policy makers.

The conceptualization and definition of CT presented here is closely related to models of information processing and online reasoning, the skills that are the focus of this special issue. These two skills are especially germane to the learning environments that college students experience today when much of their academic work is done online. Ideally, students should be capable of more than naïve Internet search, followed by copy-and-paste (e.g., McGrew et al., 2017 ); rather, for example, they should be able to critically evaluate both sources of evidence and the quality of the evidence itself in light of a given purpose ( Leu et al., 2020 ).

In this paper, we present a systematic approach to conceptualizing CT. From that conceptualization and construct definition, we present one possible framework for building performance assessments of CT with particular attention to extended performance tasks within the test environment. The penultimate section discusses some of the challenges that arise with the use and valid interpretation of performance assessment scores. We conclude the paper with a section on future perspectives in an emerging field of research – the iPAL program.

Conceptual Foundations, Definition and Measurement of Critical Thinking

In this section, we briefly review the concept of CT and its definition. In accordance with the principles of evidence-centered design (ECD; Mislevy et al., 2003), the conceptualization drives the measurement of the construct; that is, implementation of ECD directly links aspects of the assessment framework to specific facets of the construct. We then argue that performance assessments designed in accordance with such an assessment framework provide the most realistic—and most credible—approach to measuring CT. The section concludes with a sketch of an approach to CT measurement grounded in performance assessment.

Concept and Definition of Critical Thinking

Taxonomies of 21st century skills ( Pellegrino and Hilton, 2012 ) abound, and it is neither surprising that CT appears in most taxonomies of learning, nor that there are many different approaches to defining and operationalizing the construct of CT. There is, however, general agreement that CT is a multifaceted construct ( Liu et al., 2014 ). Liu et al. (2014) identified five key facets of CT: (i) evaluating evidence and the use of evidence; (ii) analyzing arguments; (iii) understanding implications and consequences; (iv) developing sound arguments; and (v) understanding causation and explanation.

There is empirical support for these facets from college faculty. A 2016–2017 survey conducted by the Higher Education Research Institute (HERI) at the University of California, Los Angeles found that a substantial majority of faculty respondents “frequently” encouraged students to: (i) evaluate the quality or reliability of the information they receive; (ii) recognize biases that affect their thinking; (iii) analyze multiple sources of information before coming to a conclusion; and (iv) support their opinions with a logical argument ( Stolzenberg et al., 2019 ).

There is general agreement that CT involves the following mental processes: identifying, evaluating, and analyzing a problem; interpreting information; synthesizing evidence; and reporting a conclusion (e.g., Erwin and Sebrell, 2003 ; Kosslyn and Nelson, 2017 ; Shavelson et al., 2018 ). We further suggest that CT includes dealing with dilemmas of ambiguity or conflict among principles and contradictory information ( Oser and Biedermann, 2020 ).

Importantly, Oser and Biedermann (2020) posit that CT can be manifested at three levels. The first level, Critical Analysis, is the most complex of the three. Critical Analysis requires both knowledge in a specific discipline (conceptual knowledge) and procedural analytical knowledge (deduction, induction, etc.). The second level is Critical Reflection, which involves more generic skills “… necessary for every responsible member of a society” (p. 90). It is “a basic attitude that must be taken into consideration if (new) information is questioned to be true or false, reliable or not reliable, moral or immoral etc.” (p. 90). To engage in Critical Reflection, one must not only apply analytic reasoning, but also adopt a reflective stance toward the political, social, and other consequences of choosing a course of action. It also involves analyzing the potential motives of the various actors involved in the dilemma of interest. The third level, Critical Alertness, involves questioning one’s own or others’ thinking from a skeptical point of view.

Wheeler and Haertel (1993) categorized higher-order skills, such as CT, into two types: (i) skills employed when solving problems and making decisions in professional and everyday life, for instance in relation to civic affairs and the environment; and (ii) mental processes (e.g., comparing, evaluating, and justifying) developed through formal instruction, usually in a discipline. In both settings, individuals must confront situations that typically involve a problematic event, contradictory information, and possibly conflicting principles. Indeed, there is an ongoing debate concerning whether CT should be evaluated using generic or discipline-based assessments (Nagel et al., 2020). Whether CT skills are conceptualized as generic or discipline-specific has implications for how they are assessed and how they are incorporated into the classroom.

In the iPAL project, CT is characterized as a multifaceted construct that comprises conceptualizing, analyzing, drawing inferences or synthesizing information, evaluating claims, and applying the results of these reasoning processes to various purposes (e.g., solving a problem, deciding on a course of action, answering a given question, or reaching a conclusion) (Shavelson et al., 2019). In the course of carrying out a CT task, an individual typically engages in activities such as specifying or clarifying a problem; deciding what information is relevant to the problem; evaluating the trustworthiness of information; avoiding judgmental errors based on “fast thinking”; avoiding biases and stereotypes; recognizing different perspectives and how they can reframe a situation; considering the consequences of alternative courses of action; and communicating decisions and actions clearly and concisely. The order in which these activities are carried out can vary among individuals, and the processes can be non-linear and reciprocal.

In this article, we focus on generic CT skills. The importance of these skills derives not only from their utility in academic and professional settings, but also from the many situations involving challenging moral and ethical issues – often framed in terms of conflicting principles and/or interests – in which individuals must apply them (Kegan, 1994; Tessier-Lavigne, 2020). Conflicts and dilemmas are ubiquitous in the contexts in which adults find themselves: work, family, and civil society. Moreover, to remain viable in the global economic environment – one characterized by increased competition and advances in second-generation artificial intelligence (AI) – today’s college students will need to continually develop and leverage their CT skills. Ideally, colleges offer a supportive environment in which students can develop and practice effective approaches to reasoning about, and acting in, learning, professional, and everyday situations.

Measurement of Critical Thinking

Critical thinking is a multifaceted construct that poses many challenges to those who would develop relevant and valid assessments. Readers interested in current approaches to the measurement of CT beyond the focus of this paper may consult Zlatkin-Troitschanskaia et al. (2018).

In this paper, we have singled out performance assessment as it offers important advantages for measuring CT. Extant tests of CT typically employ response formats such as forced-choice or short-answer, and scenario-based tasks (for an overview, see Liu et al., 2014). They all suffer from moderate to severe construct underrepresentation; that is, they fail to capture important facets of the CT construct such as perspective taking and communication. High-fidelity performance tasks are viewed as more authentic in that they provide a problem context and require responses more similar to what individuals confront in the real world than what is offered by traditional multiple-choice items (Messick, 1994; Braun, 2019). This greater verisimilitude promises higher levels of construct representation and lower levels of construct-irrelevant variance. Such performance tasks have the capacity to measure facets of CT that are imperfectly assessed, if at all, by traditional assessments (Lane and Stone, 2006; Braun, 2019; Shavelson et al., 2019). However, these assertions must be empirically validated, and the measures subjected to psychometric analyses. Evidence of the reliability and validity of performance assessment (PA), and of the interpretative challenges it poses, is extensively detailed in Davey et al. (2015).

We adopt the following definition of performance assessment:

A performance assessment (sometimes called a work sample when assessing job performance) … is an activity or set of activities that requires test takers, either individually or in groups, to generate products or performances in response to a complex, most often real-world task. These products and performances provide observable evidence bearing on test takers’ knowledge, skills, and abilities—their competencies—in completing the assessment ( Davey et al., 2015 , p. 10).

A performance assessment typically includes an extended performance task and short constructed-response and selected-response (i.e., multiple-choice) tasks (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019). In this paper, we refer to both individual performance and constructed-response tasks as performance tasks (PTs) (for an example, see Table 1 in section “iPAL Assessment Framework”).

Table 1. The iPAL assessment framework.

An Approach to Performance Assessment of Critical Thinking: The iPAL Program

The approach to CT presented here is the result of ongoing work undertaken by the International Performance Assessment of Learning collaborative (iPAL). iPAL is an international consortium of volunteers, primarily from academia, who have come together to address the dearth of research and practice in measuring CT in higher education with performance tasks (Shavelson et al., 2018). In this section, we present iPAL’s assessment framework as the basis for measuring CT, with examples along the way.

iPAL Background

The iPAL assessment framework builds on the Council for Aid to Education’s Collegiate Learning Assessment (CLA). The CLA was designed to measure cross-disciplinary, generic competencies, such as CT, analytic reasoning, problem solving, and written communication (Klein et al., 2007; Shavelson, 2010). Ideally, each PA contained an extended PT (e.g., examining a range of evidential materials related to the crash of an aircraft) and two short PTs: one in which students critique an argument, and one in which they propose a solution to a real-world societal issue.

Motivated by considerations of adequate reliability, the CLA was modified in 2012 to create the CLA+. The CLA+ includes two subtests: a PT and a 25-item Selected-Response Question (SRQ) section. The PT presents a document or problem statement and an assignment based on that document which elicits an open-ended response. The SRQ section (which is not linked substantively to the PT scenario) was added to increase the number of student responses and thereby obtain more reliable estimates of performance at the student level than could be achieved with a single PT (Zahner, 2013; Davey et al., 2015).

iPAL Assessment Framework

Methodological foundations.

The iPAL framework evolved from the Collegiate Learning Assessment developed by Klein et al. (2007) . It was also informed by the results from the AHELO pilot study ( Organisation for Economic Co-operation and Development [OECD], 2012 , 2013 ), as well as the KoKoHs research program in Germany (for an overview see, Zlatkin-Troitschanskaia et al., 2017 , 2020 ). The ongoing refinement of the iPAL framework has been guided in part by the principles of Evidence Centered Design (ECD) ( Mislevy et al., 2003 ; Mislevy and Haertel, 2006 ; Haertel and Fujii, 2017 ).

In educational measurement, an assessment framework plays a critical intermediary role between the theoretical formulation of the construct and the development of the assessment instrument containing tasks (or items) intended to elicit evidence with respect to that construct (Mislevy et al., 2003). Builders of the assessment framework draw on the construct theory and operationalize it in a way that provides explicit guidance to the developers of PTs. Thus, the framework should reflect the relevant facets of the construct, where relevance is determined by substantive theory or an appropriate alternative such as behavioral samples from real-world situations of interest (criterion sampling; McClelland, 1973), as well as the intended use(s) (for an example, see Shavelson et al., 2019). By following the requirements and guidelines embodied in the framework, instrument developers strengthen the claim of construct validity for the instrument (Messick, 1994).

An assessment framework can be specified at different levels of granularity: an assessment battery (“omnibus” assessment, for an example see below), a single performance task, or a specific component of an assessment ( Shavelson, 2010 ; Davey et al., 2015 ). In the iPAL program, a performance assessment comprises one or more extended performance tasks and additional selected-response and short constructed-response items. The focus of the framework specified below is on a single PT intended to elicit evidence with respect to some facets of CT, such as the evaluation of the trustworthiness of the documents provided and the capacity to address conflicts of principles.

From the ECD perspective, an assessment is an instrument for generating information to support an evidentiary argument and, therefore, the intended inferences (claims) must guide each stage of the design process. The construct of interest is operationalized through the Student Model , which represents the target knowledge, skills, and abilities, as well as the relationships among them. The student model should also make explicit the assumptions regarding student competencies in foundational skills or content knowledge. The Task Model specifies the features of the problems or items posed to the respondent, with the goal of eliciting the evidence desired. The assessment framework also describes the collection of task models comprising the instrument, with considerations of construct validity, various psychometric characteristics (e.g., reliability) and practical constraints (e.g., testing time and cost). The student model provides grounds for evidence of validity, especially cognitive validity; namely, that the students are thinking critically in responding to the task(s).

In the present context, the target construct (CT) is the competence of individuals to think critically, which entails solving complex, real-world problems, and clearly communicating their conclusions or recommendations for action based on trustworthy, relevant and unbiased information. The situations, drawn from actual events, are challenging and may arise in many possible settings. In contrast to more reductionist approaches to assessment development, the iPAL approach and framework rests on the assumption that properly addressing these situational demands requires the application of a constellation of CT skills appropriate to the particular task presented (e.g., Shavelson, 2010 , 2013 ). For a PT, the assessment framework must also specify the rubric by which the responses will be evaluated. The rubric must be properly linked to the target construct so that the resulting score profile constitutes evidence that is both relevant and interpretable in terms of the student model (for an example, see Zlatkin-Troitschanskaia et al., 2019 ).

iPAL Task Framework

The iPAL ‘omnibus’ framework comprises four main aspects: a storyline, a challenge, a document library, and a scoring rubric. Table 1 displays these aspects, brief descriptions of each, and corresponding examples drawn from an iPAL performance assessment (version adapted from the original in Hyytinen and Toom, 2019). Storylines are drawn from various domains; for example, the worlds of business, public policy, civics, medicine, and family. They often involve moral and/or ethical considerations. Deriving an appropriate storyline from a real-world situation requires careful consideration of which features are to be kept in toto, which are to be adapted for purposes of the assessment, and which are to be discarded. Framing the challenge demands care in wording so that there is minimal ambiguity in what is required of the respondent. The difficulty of the challenge depends, in large part, on the nature and extent of the information provided in the document library, the amount of scaffolding included, as well as the scope of the required response. The amount of information and the scope of the challenge should be commensurate with the amount of time available. As is evident from the table, the characteristics of the documents in the library are intended to elicit responses related to facets of CT. For example, with regard to bias, the information provided is intended to invite judgmental errors arising from fast thinking and/or motivated reasoning. Ideally, the situation should accommodate multiple solutions of varying degrees of merit.
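As a concrete, simplified illustration, the four aspects of the omnibus framework can be represented as a small data structure. Everything below – the class names, fields, and example content – is an illustrative placeholder of ours, not drawn from an actual iPAL task:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Document:
    source: str        # e.g., official report, blog post, data table
    trustworthy: bool  # reliability of the source, known to the task designers
    relevant: bool     # whether the content bears on the challenge

@dataclass
class PerformanceTask:
    storyline: str                # curated version of a complex, real-world situation
    challenge: str                # the task the respondent must accomplish
    library: List[Document]       # portfolio of documents in a range of formats
    rubric_dimensions: List[str]  # one scoring scale per facet of the construct

# Hypothetical example instance (content invented for illustration only).
task = PerformanceTask(
    storyline="An airline must respond to conflicting reports about a crash.",
    challenge="Recommend a course of action and justify it in a memo.",
    library=[
        Document("official accident report", trustworthy=True, relevant=True),
        Document("anonymous blog post", trustworthy=False, relevant=True),
    ],
    rubric_dimensions=[
        "evaluating evidence", "analyzing arguments",
        "implications and consequences", "sound argumentation",
        "causation and explanation", "coherence and persuasiveness",
    ],
)
```

Mixing trustworthy and untrustworthy documents in the library is what gives the task designer a handle on the evidence-evaluation facet: the rubric can reward respondents who discount the unreliable source.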

The dimensions of the scoring rubric are derived from the Task Model and Student Model ( Mislevy et al., 2003 ) and signal which features are to be extracted from the response and indicate how they are to be evaluated. There should be a direct link between the evaluation of the evidence and the claims that are made with respect to the key features of the task model and student model . More specifically, the task model specifies the various manipulations embodied in the PA and so informs scoring, while the student model specifies the capacities students employ in more or less effectively responding to the tasks. The score scales for each of the five facets of CT (see section “Concept and Definition of Critical Thinking”) can be specified using appropriate behavioral anchors (for examples, see Zlatkin-Troitschanskaia and Shavelson, 2019 ). Of particular importance is the evaluation of the response with respect to the last dimension of the scoring rubric; namely, the overall coherence and persuasiveness of the argument, building on the explicit or implicit characteristics related to the first five dimensions. The scoring process must be monitored carefully to ensure that (trained) raters are judging each response based on the same types of features and evaluation criteria ( Braun, 2019 ) as indicated by interrater agreement coefficients.

The scoring rubric of the iPAL omnibus framework can be modified for specific tasks ( Lane and Stone, 2006 ). This generic rubric helps ensure consistency across rubrics for different storylines. For example, Zlatkin-Troitschanskaia et al. (2019 , p. 473) used the following scoring scheme:

Based on our construct definition of CT and its four dimensions: (D1-Info) recognizing and evaluating information, (D2-Decision) recognizing and evaluating arguments and making decisions, (D3-Conseq) recognizing and evaluating the consequences of decisions, and (D4-Writing), we developed a corresponding analytic dimensional scoring … The students’ performance is evaluated along the four dimensions, which in turn are subdivided into a total of 23 indicators as (sub)categories of CT … For each dimension, we sought detailed evidence in students’ responses for the indicators and scored them on a six-point Likert-type scale. In order to reduce judgment distortions, an elaborate procedure of ‘behaviorally anchored rating scales’ (Smith and Kendall, 1963) was applied by assigning concrete behavioral expectations to certain scale points (Bernardin et al., 1976). To this end, we defined the scale levels by short descriptions of typical behavior and anchored them with concrete examples. … We trained four raters in 1 day using a specially developed training course to evaluate students’ performance along the 23 indicators clustered into four dimensions (for a description of the rater training, see Klotzer, 2018).
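The dimensional scoring in the quoted scheme lends itself to a simple aggregation sketch. Note that the allocation of the 23 indicators across the four dimensions below, and the scores themselves, are invented for illustration; the published scheme is not reproduced here:

```python
# Hypothetical rater judgments: dimension -> indicator scores on the
# six-point behaviorally anchored scale. The split of the 23 indicators
# across dimensions is an assumption made for this sketch.
scores = {
    "D1-Info": [5, 4, 6, 5, 4, 5],
    "D2-Decision": [3, 4, 4, 3, 5, 4, 4],
    "D3-Conseq": [2, 3, 3, 4, 3],
    "D4-Writing": [5, 5, 4, 6, 5],
}

assert sum(len(v) for v in scores.values()) == 23  # 23 indicators in total

def dimension_means(s):
    """Aggregate indicator scores into one mean score per dimension."""
    return {dim: round(sum(vals) / len(vals), 2) for dim, vals in s.items()}

profile = dimension_means(scores)
```

Averaging within dimensions yields the kind of four-dimensional score profile that can then be used to group students into low-, middle-, and high-performer profiles.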

Shavelson et al. (2019) examined the interrater agreement of the scoring scheme developed by Zlatkin-Troitschanskaia et al. (2019) and “found that with 23 items and 2 raters the generalizability (“reliability”) coefficient for total scores to be 0.74 (with 4 raters, 0.84)” (Shavelson et al., 2019, p. 15). In the study by Zlatkin-Troitschanskaia et al. (2019, p. 478), three score profiles (low-, middle-, and high-performers) were identified among students. Proper interpretation of such profiles requires care. For example, there may be multiple possible explanations for low scores: poor CT skills, a lack of disposition to engage with the challenge, or both. These alternative explanations for student performance can potentially pose a threat to the evidentiary argument. In this case, auxiliary information may be available to aid in resolving the ambiguity. For example, student responses to selected- and short constructed-response items in the PA can provide relevant information about the levels of the different skills possessed by the student. When sufficient data are available, the scores can be modeled statistically and/or qualitatively in such a way as to bring them to bear on the technical quality or interpretability of the claims of the assessment: reliability, validity, and utility evidence (Davey et al., 2015; Zlatkin-Troitschanskaia et al., 2019). These kinds of concerns are less critical when PTs are used in classroom settings, where the instructor can draw on other sources of evidence, including direct discussion with the student.
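The reported two- and four-rater coefficients are roughly consistent with the Spearman-Brown prophecy formula, which projects the reliability of a score averaged over raters. The formula is standard; applying it to these figures is our illustration:

```python
def spearman_brown(rel_k: float, k: int, m: int) -> float:
    """Project the reliability of an average over k raters to m raters."""
    r1 = rel_k / (k - rel_k * (k - 1))  # implied single-rater reliability
    return m * r1 / (1 + (m - 1) * r1)

# Starting from the reported 2-rater coefficient of 0.74, the projected
# 4-rater coefficient is about 0.85, close to the reported 0.84.
projected = spearman_brown(0.74, 2, 4)
```

The small gap between the projection (about 0.85) and the reported value (0.84) is unsurprising, since the prophecy formula assumes interchangeable raters.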

Use of iPAL Performance Assessments in Educational Practice: Evidence From Preliminary Validation Studies

The assessment framework described here supports the development of a PT in a general setting. Many modifications are possible and, indeed, desirable. If the PT is to be more deeply embedded in a certain discipline (e.g., economics, law, or medicine), for example, then the framework must specify characteristics of the narrative and the complementary documents as to the breadth and depth of disciplinary knowledge that is represented.

To date, preliminary field trials employing the omnibus framework (i.e., a full set of documents) have indicated that 60 min is generally insufficient for students to engage with the full set of complementary documents and to craft a complete response to the challenge (for an example, see Shavelson et al., 2019). Accordingly, it would be helpful to develop modified frameworks for PTs that require substantially less time; for an example, see the short performance assessments of civic online reasoning, requiring response times from 10 to 50 min (Wineburg et al., 2016). Such assessment frameworks could be derived from the omnibus framework by focusing on a reduced number of facets of CT and specifying the characteristics of the complementary documents to be included – or, perhaps, choices among sets of documents. In principle, one could build a ‘family’ of PTs, each using the same (or nearly the same) storyline and a subset of the full collection of complementary documents.

Paul and Elder (2007) argue that the goal of CT assessments should be to provide faculty with important information about how well their instruction supports the development of students’ CT. In that spirit, the full family of PTs could represent all facets of the construct while affording instructors and students more specific insights into strengths and weaknesses with respect to particular facets of CT. Moreover, the framework should be expanded to include the design of a set of short-answer and/or multiple-choice items to accompany the PT. Ideally, these additional items would be based on the same narrative as the PT to collect more nuanced information on students’ precursor skills, such as reading comprehension, while enhancing the overall reliability of the assessment. Areas where students are under-prepared could be addressed before, or even in parallel with, the development of the focal CT skills; the parallel approach follows the co-requisite model of developmental education. In other settings (e.g., for summative assessment), these complementary items would be administered after the PT to augment the evidence in relation to the various claims. The full PT, taking 90 min or more, could serve as a capstone assessment.

As we transition from simply delivering paper-based assessments by computer to taking full advantage of the affordances of a digital platform, we should learn from the hard-won lessons of the past so that we can make swifter progress with fewer missteps. In that regard, we must take validity as the touchstone – assessment design, development and deployment must all be tightly linked to the operational definition of the CT construct. Considerations of reliability and practicality come into play with various use cases that highlight different purposes for the assessment (for future perspectives, see next section).

The iPAL assessment framework represents a feasible compromise between commercial, standardized assessments of CT (e.g., Liu et al., 2014), on the one hand, and, on the other, freedom for individual faculty to develop assessment tasks according to idiosyncratic models. It imposes a degree of standardization on both task development and scoring, while still allowing some flexibility for faculty to tailor the assessment to meet their unique needs. In so doing, it addresses a key weakness of the AAC&U’s VALUE initiative, which has achieved wide acceptance among United States colleges.

The VALUE initiative has produced generic scoring rubrics for 15 domains, including CT, problem solving, and written communication. A rubric for a particular skill domain (e.g., critical thinking) has five to six dimensions with four ordered performance levels for each dimension (1 = lowest, 4 = highest). The performance levels are accompanied by language that is intended to clearly differentiate among levels. Faculty are asked to submit student work products from a senior-level course that is intended to yield evidence with respect to student learning outcomes in a particular domain and that, they believe, can elicit performances at the highest level. The collection of work products is then graded by faculty from other institutions who have been trained to apply the rubrics.

A principal difficulty is that there is neither a common framework to guide the design of the challenge, nor any control on task complexity and difficulty. Consequently, there is substantial heterogeneity in the quality and evidential value of the submitted responses. This also causes difficulties with task scoring and inter-rater reliability. Shavelson et al. (2009) discuss some of the problems arising with non-standardized collections of student work.

In this context, one advantage of the iPAL framework is that it can provide valuable guidance and an explicit structure for faculty in developing performance tasks for both instruction and formative assessment. When faculty design assessments, their focus is typically on content coverage rather than other potentially important characteristics, such as the degree of construct representation and the adequacy of their scoring procedures ( Braun, 2019 ).

Concluding Reflections

Challenges to interpretation and implementation.

Performance tasks such as those generated by iPAL are attractive instruments for assessing CT skills (e.g., Shavelson, 2010; Shavelson et al., 2019). The attraction mainly rests on the assumption that elaborated PTs are more authentic (direct) and more completely capture facets of the target construct (i.e., possess greater construct representation) than the widely used selected-response tests. However, as Messick (1994) noted, authenticity is a “promissory note” that must be redeemed with empirical research. In practice, there are trade-offs among authenticity, construct validity, and psychometric qualities such as reliability (Davey et al., 2015).

One reason for Messick’s (1994) caution is that authenticity does not guarantee construct validity. The latter must be established by drawing on multiple sources of evidence (American Educational Research Association et al., 2014). Following ECD principles in designing and developing the PT, as well as the associated scoring rubrics, constitutes an important type of evidence. Further, as Leighton (2019) argues, response process data (“cognitive validity”) are needed to validate claims regarding the cognitive complexity of PTs. Relevant data can be obtained through cognitive laboratory studies involving methods such as think-aloud protocols or eye tracking. Although time-consuming and expensive, such studies can yield not only evidence of validity, but also valuable information to guide refinements of the PT.

Going forward, iPAL PTs must be subjected to validation studies as recommended in the Standards for Educational and Psychological Testing (American Educational Research Association et al., 2014). With a particular focus on the criterion “relationships to other variables,” a framework should include assumptions about the theoretically expected relationships among the indicators assessed by the PT, as well as the indicators’ relationships to external variables such as intelligence or prior (task-relevant) knowledge.

Complementing the necessity of evaluating construct validity, there is the need to consider potential sources of construct-irrelevant variance (CIV). One pertains to student motivation, which is typically greater when the stakes are higher. If students are not motivated, then their performance is likely to be impacted by factors unrelated to their (construct-relevant) ability ( Lane and Stone, 2006 ; Braun et al., 2011 ; Shavelson, 2013 ). Differential motivation across groups can also bias comparisons. Student motivation might be enhanced if the PT is administered in the context of a course with the promise of generating useful feedback on students’ skill profiles.

Construct-irrelevant variance can also occur when students are not equally prepared for the format of the PT or fully appreciate the response requirements. This source of CIV could be alleviated by providing students with practice PT’s. Finally, the use of novel forms of documentation, such as those from the Internet, can potentially introduce CIV due to differential familiarity with forms of representation or contents. Interestingly, this suggests that there may be a conflict between enhancing construct representation and reducing CIV.

Another potential source of CIV is related to response evaluation. Even with training, human raters can vary in accuracy and usage of the full score range. In addition, raters may attend to features of responses that are unrelated to the target construct, such as the length of the students’ responses or the frequency of grammatical errors ( Lane and Stone, 2006 ). Some of these sources of variance could be addressed in an online environment, where word processing software could alert students to potential grammatical and spelling errors before they submit their final work product.

Performance tasks generally take longer to administer and are more costly than traditional assessments, making it more difficult to reliably measure student performance ( Messick, 1994 ; Davey et al., 2015 ). Indeed, it is well known that more than one performance task is needed to obtain high reliability ( Shavelson, 2013 ). This is due to both student-task interactions and variability in scoring. Sources of student-task interactions are differential familiarity with the topic ( Hyytinen and Toom, 2019 ) and differential motivation to engage with the task. The level of reliability required, however, depends on the context of use. For formative assessment as part of an instructional program, reliability requirements can be lower than for summative use. In the former case, other types of evidence are generally available to support interpretation and guide pedagogical decisions. Further studies are needed to obtain estimates of reliability in typical instructional settings.

With sufficient data, more sophisticated psychometric analyses become possible. One challenge is that the assumption of unidimensionality required for many psychometric models might be untenable for performance tasks ( Davey et al., 2015 ). Davey et al. (2015) provide the example of a mathematics assessment that requires students to demonstrate not only their mathematics skills but also their written communication skills. Although the iPAL framework does not explicitly address students’ reading comprehension and organization skills, students will likely need to call on these abilities to accomplish the task. Moreover, as the operational definition of CT makes evident, the student must not only deploy several skills in responding to the challenge of the PT, but also carry out component tasks in sequence. The former requirement strongly indicates the need for a multi-dimensional IRT model, while the latter suggests that the usual assumption of local item independence may well be problematic ( Lane and Stone, 2006 ). At the same time, the analytic scoring rubric should facilitate the use of latent class analysis to partition data from large groups into meaningful categories ( Zlatkin-Troitschanskaia et al., 2019 ).

Future Perspectives

Although the iPAL consortium has made substantial progress in the assessment of CT, much remains to be done. Further refinement of existing PT’s and their adaptation to different languages and cultures must continue. To this point, there are a number of examples: The refugee crisis PT (cited in Table 1 ) was translated and adapted from Finnish to US English and then to Colombian Spanish. A PT concerning kidney transplants was translated and adapted from German to US English. Finally, two PT’s based on ‘legacy admissions’ to US colleges were translated and adapted to Colombian Spanish.

With respect to data collection, there is a need for sufficient data to support psychometric analysis of student responses, especially the relationships among the different components of the scoring rubric, as this would inform both task development and response evaluation ( Zlatkin-Troitschanskaia et al., 2019 ). In addition, more intensive study of response processes through cognitive laboratories and the like is needed to strengthen the evidential argument for construct validity ( Leighton, 2019 ). We are currently conducting empirical studies, collecting data on both iPAL PT’s and other measures of CT. These studies will provide evidence of convergent and discriminant validity.

At the same time, efforts should be directed at further development to support different ways CT PT’s might be used—i.e., use cases—especially those that call for formative use of PT’s. Incorporating formative assessment into courses can plausibly be expected to improve students’ competency acquisition ( Zlatkin-Troitschanskaia et al., 2017 ). With suitable choices of storylines, appropriate combinations of (modified) PT’s, supplemented by short-answer and multiple-choice items, could be interwoven into ordinary classroom activities. The supplementary items may be completely separate from the PT’s (as is the case with the CLA+), loosely coupled with the PT’s (as in drawing on the same storyline), or tightly linked to the PT’s (as in requiring elaboration of certain components of the response to the PT).

As an alternative to such integration, stand-alone modules could be embedded in courses to yield evidence of students’ generic CT skills. Core curriculum courses or general education courses offer ideal settings for embedding performance assessments. If these assessments were administered to a representative sample of students in each cohort over their years in college, the results would yield important information on the development of CT skills at a population level. As another example, these PA’s could be used to assess the competence profiles of students entering Bachelor’s or graduate-level programs as a basis for more targeted instructional support.

Thus, in considering different use cases for the assessment of CT, it is evident that several modifications of the iPAL omnibus assessment framework are needed. As noted earlier, assessments built according to this framework are demanding with respect to the extensive preliminary work required by a task and the time required to properly complete it. Thus, it would be helpful to have modified versions of the framework, focusing on one or two facets of the CT construct and calling for a smaller number of supplementary documents. The challenge to the student should be suitably reduced.

Some members of the iPAL collaborative have developed PT’s that are embedded in disciplines such as engineering, law and education ( Crump et al., 2019 ; for teacher education examples, see Jeschke et al., 2019 ). These are proving to be of great interest to various stakeholders and further development is likely. Consequently, it is essential that an appropriate assessment framework be established and implemented. It is both a conceptual and an empirical question as to whether a single framework can guide development in different domains.

Performance Assessment in Online Learning Environment

Over the last 15 years, increasing amounts of time in both college and work have been spent using computers and other electronic devices. This has led to the formulation of models for the new literacies that attempt to capture some key characteristics of these activities. A prominent example is a model proposed by Leu et al. (2020) . The model frames online reading as a process of problem-based inquiry that calls on five practices during online research and comprehension:

1. Reading to identify important questions,

2. Reading to locate information,

3. Reading to critically evaluate information,

4. Reading to synthesize online information, and

5. Reading and writing to communicate online information.

The parallels with the iPAL definition of CT are evident and suggest there may be benefits to closer links between these two lines of research. For example, a report by Leu et al. (2014) describes empirical studies comparing assessments of online reading using either open-ended or multiple-choice response formats.

The iPAL consortium has begun to take advantage of the affordances of the online environment (for examples, see Schmidt et al. and Nagel et al. in this special issue). Most obviously, Supplementary Materials can now include archival photographs, audio recordings, or videos. Additional tasks might include the online search for relevant documents, though this would add considerably to the time demands. This online search could occur within a simulated Internet environment, as is the case for the IEA’s ePIRLS assessment ( Mullis et al., 2017 ).

The prospect of having access to a wealth of materials that can add to task authenticity is exciting. Yet it can also add ambiguity and information overload. Increased authenticity, then, should be weighed against validity concerns and the time required to absorb the content in these materials. Modifications of the design framework and extensive empirical testing will be required to decide on appropriate trade-offs. A related possibility is to employ some of these materials in short-answer (or even selected-response) items that supplement the main PT. Response formats could include highlighting text or using a drag-and-drop menu to construct a response. Students’ responses could be automatically scored, thereby containing costs. With automated scoring, feedback to students and faculty, including suggestions for next steps in strengthening CT skills, could also be provided without adding to faculty workload. Therefore, taking advantage of the online environment to incorporate new types of supplementary documents, and perhaps to introduce new response formats as well, should be a high priority. Finally, further investigation of the overlap between this formulation of CT and the characterization of online reading promulgated by Leu et al. (2020) is a promising direction to pursue.

Data Availability Statement

All datasets generated for this study are included in the article/supplementary material.

Author Contributions

HB wrote the article. RS, OZ-T, and KB were involved in the preparation and revision of the article and co-wrote the manuscript. All authors contributed to the article and approved the submitted version.

Funding

This study was funded in part by the Spencer Foundation (Grant No. 201700123).

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.


Acknowledgments

We would like to thank all the researchers who have participated in the iPAL program.

  • ^ When test results are reported by means of substantively defined categories, the scoring is termed “criterion-referenced.” This is in contrast to results reported as percentiles; such scoring is termed “norm-referenced.”

American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (2014). Standards for Educational and Psychological Testing. Washington, D.C: American Educational Research Association.

Arum, R., and Roksa, J. (2011). Academically Adrift: Limited Learning on College Campuses. Chicago, IL: University of Chicago Press.

Association of American Colleges and Universities (n.d.). VALUE: What is value? Available online at: (accessed May 7, 2020).

Association of American Colleges and Universities [AACU] (2018). Fulfilling the American Dream: Liberal Education and the Future of Work. Available online at: (accessed May 1, 2020).

Braun, H. (2019). Performance assessment and standardization in higher education: a problematic conjunction? Br. J. Educ. Psychol. 89, 429–440. doi: 10.1111/bjep.12274

Braun, H. I., Kirsch, I., and Yamamoto, K. (2011). An experimental study of the effects of monetary incentives on performance on the 12th grade NAEP reading assessment. Teach. Coll. Rec. 113, 2309–2344.

Crump, N., Sepulveda, C., Fajardo, A., and Aguilera, A. (2019). Systematization of performance tests in critical thinking: an interdisciplinary construction experience. Rev. Estud. Educ. 2, 17–47.

Davey, T., Ferrara, S., Shavelson, R., Holland, P., Webb, N., and Wise, L. (2015). Psychometric Considerations for the Next Generation of Performance Assessment. Washington, DC: Center for K-12 Assessment & Performance Management, Educational Testing Service.

Erwin, T. D., and Sebrell, K. W. (2003). Assessment of critical thinking: ETS’s tasks in critical thinking. J. Gen. Educ. 52, 50–70. doi: 10.1353/jge.2003.0019

Haertel, G. D., and Fujii, R. (2017). “Evidence-centered design and postsecondary assessment,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 313–339. doi: 10.4324/9781315709307-26

Hyytinen, H., and Toom, A. (2019). Developing a performance assessment task in the Finnish higher education context: conceptual and empirical insights. Br. J. Educ. Psychol. 89, 551–563. doi: 10.1111/bjep.12283

Hyytinen, H., Toom, A., and Shavelson, R. J. (2019). “Enhancing scientific thinking through the development of critical thinking in higher education,” in Redefining Scientific Thinking for Higher Education: Higher-Order Thinking, Evidence-Based Reasoning and Research Skills , eds M. Murtonen and K. Balloo (London: Palgrave MacMillan).

Indiana University (2019). FSSE 2019 Frequencies: FSSE 2019 Aggregate. Available online at: (accessed May 1, 2020).

Jeschke, C., Kuhn, C., Lindmeier, A., Zlatkin-Troitschanskaia, O., Saas, H., and Heinze, A. (2019). Performance assessment to investigate the domain specificity of instructional skills among pre-service and in-service teachers of mathematics and economics. Br. J. Educ. Psychol. 89, 538–550. doi: 10.1111/bjep.12277

Kegan, R. (1994). In Over Our Heads: The Mental Demands of Modern Life. Cambridge, MA: Harvard University Press.

Klein, S., Benjamin, R., Shavelson, R., and Bolus, R. (2007). The collegiate learning assessment: facts and fantasies. Eval. Rev. 31, 415–439. doi: 10.1177/0193841x07303318

Kosslyn, S. M., and Nelson, B. (2017). Building the Intentional University: Minerva and the Future of Higher Education. Cambridge, MA: The MIT Press.

Lane, S., and Stone, C. A. (2006). “Performance assessment,” in Educational Measurement , 4th Edn, ed. R. L. Brennan (Lanham, MD: Rowman & Littlefield Publishers), 387–432.

Leighton, J. P. (2019). The risk–return trade-off: performance assessments and cognitive validation of inferences. Br. J. Educ. Psychol. 89, 441–455. doi: 10.1111/bjep.12271

Leu, D. J., Kiili, C., Forzani, E., Zawilinski, L., McVerry, J. G., and O’Byrne, W. I. (2020). “The new literacies of online research and comprehension,” in The Concise Encyclopedia of Applied Linguistics , ed. C. A. Chapelle (Oxford: Wiley-Blackwell), 844–852.

Leu, D. J., Kulikowich, J. M., Kennedy, C., and Maykel, C. (2014). “The ORCA Project: designing technology-based assessments for online research,” in Paper Presented at the American Educational Research Annual Meeting , Philadelphia, PA.

Liu, O. L., Frankel, L., and Roohr, K. C. (2014). Assessing critical thinking in higher education: current state and directions for next-generation assessments. ETS Res. Rep. Ser. 1, 1–23. doi: 10.1002/ets2.12009

McClelland, D. C. (1973). Testing for competence rather than for “intelligence.”. Am. Psychol. 28, 1–14. doi: 10.1037/h0034092

McGrew, S., Ortega, T., Breakstone, J., and Wineburg, S. (2017). The challenge that’s bigger than fake news: civic reasoning in a social media environment. Am. Educ. 4, 4-9, 39.

Mejía, A., Mariño, J. P., and Molina, A. (2019). Incorporating perspective analysis into critical thinking performance assessments. Br. J. Educ. Psychol. 89, 456–467. doi: 10.1111/bjep.12297

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educ. Res. 23, 13–23. doi: 10.3102/0013189x023002013

Mislevy, R. J., Almond, R. G., and Lukas, J. F. (2003). A brief introduction to evidence-centered design. ETS Res. Rep. Ser. 2003, i–29. doi: 10.1002/j.2333-8504.2003.tb01908.x

Mislevy, R. J., and Haertel, G. D. (2006). Implications of evidence-centered design for educational testing. Educ. Meas. Issues Pract. 25, 6–20. doi: 10.1111/j.1745-3992.2006.00075.x

Mullis, I. V. S., Martin, M. O., Foy, P., and Hooper, M. (2017). ePIRLS 2016 International Results in Online Informational Reading. Available online at: (accessed May 1, 2020).

Nagel, M.-T., Zlatkin-Troitschanskaia, O., Schmidt, S., and Beck, K. (2020). “Performance assessment of generic and domain-specific skills in higher education economics,” in Student Learning in German Higher Education , eds O. Zlatkin-Troitschanskaia, H. A. Pant, M. Toepper, and C. Lautenbach (Berlin: Springer), 281–299. doi: 10.1007/978-3-658-27886-1_14

Organisation for Economic Co-operation and Development [OECD] (2012). AHELO: Feasibility Study Report, Vol. 1: Design and Implementation. Paris: OECD.

Organisation for Economic Co-operation and Development [OECD] (2013). AHELO: Feasibility Study Report, Vol. 2: Data Analysis and National Experiences. Paris: OECD.

Oser, F. K., and Biedermann, H. (2020). “A three-level model for critical thinking: critical alertness, critical reflection, and critical analysis,” in Frontiers and Advances in Positive Learning in the Age of Information (PLATO) , ed. O. Zlatkin-Troitschanskaia (Cham: Springer), 89–106. doi: 10.1007/978-3-030-26578-6_7

Paul, R., and Elder, L. (2007). Consequential validity: using assessment to drive instruction. Found. Crit. Think. 29, 31–40.

Pellegrino, J. W., and Hilton, M. L. (eds) (2012). Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century. Washington, DC: National Academies Press.

Shavelson, R. (2010). Measuring College Learning Responsibly: Accountability in a New Era. Redwood City, CA: Stanford University Press.

Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educ. Psychol. 48, 73–86. doi: 10.1080/00461520.2013.779483

Shavelson, R. J., Zlatkin-Troitschanskaia, O., Beck, K., Schmidt, S., and Marino, J. P. (2019). Assessment of university students’ critical thinking: next generation performance assessment. Int. J. Test. 19, 337–362. doi: 10.1080/15305058.2018.1543309

Shavelson, R. J., Zlatkin-Troitschanskaia, O., and Marino, J. P. (2018). “International performance assessment of learning in higher education (iPAL): research and development,” in Assessment of Learning Outcomes in Higher Education: Cross-National Comparisons and Perspectives , eds O. Zlatkin-Troitschanskaia, M. Toepper, H. A. Pant, C. Lautenbach, and C. Kuhn (Berlin: Springer), 193–214. doi: 10.1007/978-3-319-74338-7_10

Shavelson, R. J., Klein, S., and Benjamin, R. (2009). The limitations of portfolios. Inside Higher Educ. Available online at:

Stolzenberg, E. B., Eagan, M. K., Zimmerman, H. B., Berdan Lozano, J., Cesar-Davis, N. M., Aragon, M. C., et al. (2019). Undergraduate Teaching Faculty: The HERI Faculty Survey 2016–2017. Los Angeles, CA: UCLA.

Tessier-Lavigne, M. (2020). Putting Ethics at the Heart of Innovation. Stanford, CA: Stanford Magazine.

Wheeler, P., and Haertel, G. D. (1993). Resource Handbook on Performance Assessment and Measurement: A Tool for Students, Practitioners, and Policymakers. Palm Coast, FL: Owl Press.

Wineburg, S., McGrew, S., Breakstone, J., and Ortega, T. (2016). Evaluating Information: The Cornerstone of Civic Online Reasoning. Executive Summary. Stanford, CA: Stanford History Education Group.

Zahner, D. (2013). Reliability and Validity–CLA+. Council for Aid to Education. Available online at:

Zlatkin-Troitschanskaia, O., and Shavelson, R. J. (2019). Performance assessment of student learning in higher education [Special issue]. Br. J. Educ. Psychol. 89, i–iv, 413–563.

Zlatkin-Troitschanskaia, O., Pant, H. A., Lautenbach, C., Molerov, D., Toepper, M., and Brückner, S. (2017). Modeling and Measuring Competencies in Higher Education: Approaches to Challenges in Higher Education Policy and Practice. Berlin: Springer VS.

Zlatkin-Troitschanskaia, O., Pant, H. A., Toepper, M., and Lautenbach, C. (eds) (2020). Student Learning in German Higher Education: Innovative Measurement Approaches and Research Results. Wiesbaden: Springer.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., and Pant, H. A. (2018). “Assessment of learning outcomes in higher education: international comparisons and perspectives,” in Handbook on Measurement, Assessment, and Evaluation in Higher Education , 2nd Edn, eds C. Secolsky and D. B. Denison (Abingdon: Routledge), 686–697.

Zlatkin-Troitschanskaia, O., Shavelson, R. J., Schmidt, S., and Beck, K. (2019). On the complementarity of holistic and analytic approaches to performance assessment scoring. Br. J. Educ. Psychol. 89, 468–484. doi: 10.1111/bjep.12286

Keywords : critical thinking, performance assessment, assessment framework, scoring rubric, evidence-centered design, 21st century skills, higher education

Citation: Braun HI, Shavelson RJ, Zlatkin-Troitschanskaia O and Borowiec K (2020) Performance Assessment of Critical Thinking: Conceptualization, Design, and Implementation. Front. Educ. 5:156. doi: 10.3389/feduc.2020.00156

Received: 30 May 2020; Accepted: 04 August 2020; Published: 08 September 2020.

Copyright © 2020 Braun, Shavelson, Zlatkin-Troitschanskaia and Borowiec. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Henry I. Braun, [email protected]

This article is part of the Research Topic

Assessing Information Processing and Online Reasoning as a Prerequisite for Learning in Higher Education

Best Critical Thinking Examples to Help You Improve Your Critical and Analytical Skills

Critical thinking has been studied since ancient times. Greek philosophers such as Aristotle and Plato provided us with good critical thinking examples and the foundations for this field. Socrates is widely regarded as one of the fathers of critical thinking and deductive reasoning, skills that remain valuable in a world plagued by fake news and overwhelming amounts of information.

However, what is critical thinking? How can we use it in everyday life? In this article, we will explain what critical thinking is and why it is important, provide tips for improving your critical thinking skills, and offer the best examples of critical thinking.

What Is Critical Thinking?

Critical thinking is the ability to think clearly and logically about what to do or believe. To do so, you must establish logical connections between ideas, evaluate arguments, approach problems systematically, and reflect on your values and beliefs. Logical thinking and scientific thinking are types of thinking that depend on these skills.

Additionally, the critical thinking process involves challenging knowledge to discover the truth. It involves reviewing knowledge and information to make an informed decision. You can improve your critical thinking skills by becoming more adept at analyzing problems, identifying biases, practicing active listening and inductive reasoning, and avoiding emotional reasoning.

Where Is Critical Thinking Used?

  • Progressive education
  • Risk assessment
  • Programming
  • SAT standardized tests

Why Is Critical Thinking Important?

Critical thinking is important because it allows you to better synthesize, analyze, and interpret information. Other critical thinking skills, like problem-solving, observation, and communication, can help you advance in your career. All of these skills can enable you to understand yourself better and make better life decisions.

Many people believe they are critical thinkers. However, when drawing conclusions in real life, most people rely on common sense and fall for numerous fallacies. To avoid this, we must cultivate critical thinking dispositions to gain more insight, learn to identify a weak argument, and make better decisions. Understanding critical thinking concepts is crucial if you want to understand your thoughts and emotions, or to live a better life.

Real-World Examples of Critical Thinking

People live their lives based on the choices they make. Good choices require critical thinking skills and a constructive approach to problem-solving. For example, if you need to make deliveries to multiple locations, don’t just visit them in a random order.

To save time, determine which location is closest and work out an efficient order for the remaining stops. This is just one of many examples of critical thinking. More are listed below.
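The delivery-route habit described above can be sketched as a simple nearest-neighbor heuristic: from wherever you are, always go to the closest remaining stop. This is an illustrative sketch only; the function name and coordinates are invented, and the heuristic is not guaranteed to find the truly shortest route.

```python
import math

def greedy_route(start, stops):
    """Order delivery stops with a nearest-neighbor heuristic:
    repeatedly visit the closest remaining stop."""
    def dist(a, b):
        # Straight-line distance between two (x, y) points
        return math.hypot(a[0] - b[0], a[1] - b[1])

    route, current, remaining = [], start, list(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: dist(current, s))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

# Three stops around a starting point at (0, 0)
print(greedy_route((0, 0), [(5, 5), (1, 1), (3, 2)]))
# → [(1, 1), (3, 2), (5, 5)]
```

The same critical thinking step applies to the code as to the errand: instead of taking stops in the order they were given, you evaluate the options at each step against an explicit criterion (distance) before acting.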

  • Self-evaluation of your actions
  • HR manager resolving conflict between staff members
  • A military officer working on tactical plans
  • Professor guiding students to fresh ideas with creative questioning
  • Student defending a master’s thesis
  • Basketball coach seeking out new tactics during a timeout
  • Writer organizing content ideas
  • Applicant preparing for a job interview
  • Using a disciplined process to look for a job
  • A detective using their observational ability to analyze a crime scene

10 Great Examples of Critical Thinking

Critical Thinking Example 1: Self-Evaluation of Your Actions

Self-evaluation is essential for improving your overall performance. When you use reflective thinking or try to evaluate yourself, you analyze what went wrong and how you can improve in the future. You attempt to understand what happened and figure out what you need to change to get different results.

Many universities and schools use special questionnaires that test critical thinking abilities. For example, Cambridge, a school with over 20 years of experience in assessing critical thinking, uses a student self-assessment critical thinking questionnaire.

Critical Thinking Example 2: HR Manager Resolving Conflict Between Staff Members

Employees have disagreements in every organization. In many cases, it is the HR manager who steps in to solve the problem. However, the HR manager must first listen to both sides, determine the source of the problem, assess the situation, and decide how to proceed. As a result, soft skills such as problem-solving and conflict management are essential for HR.

Critical Thinking Example 3: A Military Officer Working on Tactical Plans

A military officer working on tactical plans for extracting fellow soldiers in a dangerous military zone is another example. In this case, the military officer must find an effective way to get the soldiers out of the danger zone while minimizing casualties, which requires logical thinking.

Critical Thinking Example 4: Professor Guiding Students to Fresh Ideas With Creative Questioning

Creative questioning is an interesting process because it can promote critical thinking. By asking creative, open-ended questions, the professor makes students think more deeply about a subject. Students then need to discern what information to select and share. Analysis of arguments is another way to foster analytical thinking among students.

Critical Thinking Example 5: Student Defending a Master’s Thesis

Writing a master’s thesis requires applying critical thinking. You seek and gather information, conduct research, perform calculations, analyze data, and draw conclusions. You also demonstrate what critical skills you used to create the thesis by explaining all of the steps and methodology you used in the research process.

Critical Thinking Example 6: Basketball Coach Seeking Out New Tactics During a Timeout

In some cases, if the match does not go well, the basketball coach may call a timeout to reassess the team’s strategy. During the timeout, a basketball coach looks for new tactics that reveal the vulnerabilities of the opposing team. The coach needs to find a way to assess the potential risks and provide a new strategy that will lead the team to victory.

Critical Thinking Example 7: Writer Organizing Content Ideas

When writing articles, writers must distinguish between good and bad information. They must also make the article flow. To accomplish this, writers must adhere to the core concept of writing format: title, introduction, body, and conclusion. This means that they have to choose certain information to insert in certain sections of the text.

Critical Thinking Example 8: Applicants Preparing for a Job Interview

If you apply for a job and go to the interview blindly, there is a high chance you will not be hired. It is preferable to arrive prepared and apply critical thinking to the interview. One tip for interview preparation is to ask yourself outcome-based questions about the job. Before going to the interview, practice answering questions and acting quickly.

Critical Thinking Example 9: Using a Disciplined Process to Look for a Job

It can be difficult to find a job. Some statistics suggest that, on average, it takes 100 to 200 applications to land one. To improve your chances, put your critical thinking cap on. Logical thinking can help you plan how to approach employers, set aside time to update your resume and skills, and create an effective cover letter .

Critical Thinking Example 10: A Detective Using Their Observational Ability to Analyze a Crime Scene

As a police detective, you must have strong critical thinking skills as well as excellent observational abilities to analyze a crime scene. You need logical inquiry and deduction skills to analyze the evidence. A police detective must have probable cause to obtain a search warrant from a judge to search a suspect’s home, which is another example of critical thinking.

Pro Tips to Boost Your Critical Thinking Skills

  • Analyze and Break It Down. Before forming an opinion, conduct extensive research and analysis. Once you have enough information, break it down and analyze what it means. Breaking a problem into smaller pieces makes each part easier to understand and helps you see the bigger picture.
  • Deal With Your Biases. Critical thinking requires constant work, as people have biases that they need to deal with throughout their lives. If a person is aware of their biases, they can be aware of their own thought process and make sure they’re not just thinking one way.
  • Seek Advice. Develop a strong sense of acquiring knowledge. This means seeking advice when you are not sure about what you know. If you don’t know something, ask someone who does. The more information you have, the better the conclusions you can draw. Accept the fact that you are not always right.

What Should Be the Next Step in My Critical Thinking Learning Journey?

Your next step in your critical thinking learning journey should be to actively use it in your everyday life. Real life offers many opportunities to solve problems. With critical and careful thinking, you can lead a better life and make more accurate decisions.

Analytical and objective reasoning are among the intellectual virtues that critical thinking offers, and they can help you get a better job. If you use critical thinking in self-evaluation, you can become a better version of yourself.

Advancing this skill can improve your professional life, sharpen your problem-solving, and help you develop and execute solutions. If you want to hold well-informed opinions and deal with your biases, advance your critical thinking skills.

Critical Thinking Examples FAQ

Is critical thinking a skill?

Yes, critical thinking is a skill, and notably a learned one. If it can be learned, then it can be taught. In many cases, however, an experienced instructor is needed to transfer the skill. It is also one of the 21st-century skills you should add to your resume.

How do you develop critical thinking skills?

Developing your critical thinking skills is a gradual process that requires deliberate effort. Changing your thought patterns and practices is a long-term project that you should commit to for the rest of your life.

Do IQ tests measure critical thinking?

No, IQ tests don’t measure critical thinking. Intelligence and critical thinking are not the same. If you want to test your critical thinking ability, you need a specialized critical thinking test. One example is the Cornell Critical Thinking Test.

What is the bandwagon fallacy?

The bandwagon fallacy is forming an opinion based on what the majority thinks: if everyone says the same thing, then it must be true. The problem with this notion is that the opinion of the majority is not always valid or a real form of knowledge. To avoid the bandwagon fallacy, you need to cultivate a critical thinking disposition.

About us: Career Karma is a platform designed to help job seekers find, research, and connect with job training programs to advance their careers. Learn about the CK publication.


Hristijan Josifovski


Evaluative thinking is a disciplined approach to inquiry and reflective practice that helps us make sound judgements using good evidence, as a matter of habit.

The following video discusses evaluative thinking. It runs for 3 minutes and 34 seconds.

Evaluation Capacity Building - Evaluative Thinking

Video transcript

A form of critical thinking

Evaluation is a form of critical thinking that involves examining evidence to make a judgement.

Evaluative claims have two parts: a conclusion and an explanation.

For example:

  • xyz was great, because…
  • xyz is disappointing, because…
  • xyz is a good way to go in this situation, because…

Drawing conclusions based on intuition is not evaluation. Neither is personal opinion, speculation or conjecture.

Each of us makes evaluative judgements every day. Sometimes these are quick assessments that don't matter much, like what to order for lunch. At other times we need to slow down our thought processes, weighing up all the factors carefully and making our deliberation transparent to others.

A disciplined approach

Evaluating a strategic direction or project in a school draws on similar thinking processes and mental disciplines as assessing student performance or recruiting a new staff member.

When we engage in evaluative thinking, we seek to:

  • suspend judgement, considering alternative explanations and allowing new evidence to change our mind
  • question assumptions, particularly about the pathway of cause and effect
  • select and develop solutions that are informed by a strong evidence base and are responsive to our context and priorities
  • value the lessons we can learn from all our experiences, disappointments as well as triumphs
  • wrestle with questions of impact and effectiveness, not just activity and implementation
  • maximise the value of existing data sources already available to us, mindful of their limitations
  • work to improve the strength of our evidence base as we go.

Cognitive bias

Evaluative thinking helps us navigate the cognitive biases that cloud our judgement.

Cognitive bias occurs when our analysis of a situation is compromised by 'mental shortcuts' or patterns of thinking that place undue emphasis on a particular perspective.

Confirmation bias is one type of cognitive bias that can easily compromise an evaluation. It occurs when the evaluator is already leaning towards a particular conclusion before they see the data. Without realising it, they then pay more attention to data that supports this position.

Although we may not be able to free ourselves from our cognitive biases, being aware of them is a good first step. The mental disciplines of evaluative thinking can help us manage these biases and keep our reasoning sharp and convincing.

Read more about cognitive bias.

Develop evaluative thinking

Working openly with colleagues helps to develop evaluative thinking in ourselves and others. Evaluative thinking sometimes comes naturally, but at other times it can feel a bit challenging - even threatening. If we want to develop evaluative thinking in others, we first need to model it ourselves.

A good way to strengthen evaluative practice in schools is to engage in evaluative thinking as a group: deliberately, transparently and in a supportive context. In this way people have the time and space to reflect on their thinking. This is particularly important if we are to identify or 'unlearn' bad habits that we may have fallen into.

For example, the simple act of being asked 'What makes you think that?' prompts us to explain how we formed our judgements, including the evidence we have considered as part of this.

The importance of modelling and collaborative practice in evaluation is highlighted in the Australian Institute for Teaching and School Leadership's (AITSL) profile relating to leading improvement, innovation and change. This profile encourages school leaders to develop 'a culture of continuous improvement' and 'a culture of trust and collaboration, where change and innovation based on research and evidence can flourish'.

As part of doing this, the Leadership Profile highlights the value of 'evaluating outcomes and refining actions as change is implemented, taking account of the impact of change on others, providing opportunities for regular feedback'.

Business Unit:

  • Centre for Education Statistics and Evaluation

100+ Performance Evaluation Comments for Attitude, Training Ability, Critical Thinking

By Editorial Team on June 19, 2022 — 9 minutes to read

  • Performance Review Phrases for Training Skills Part 1
  • Performance Review Phrases for Attitude Part 2
  • Performance Review Phrases for Critical Thinking Part 3
  • Performance Review Examples: Additional Resources Part 4

Performance evaluation comments are a way to provide feedback to employees about their work performance. Evaluation comments can include suggestions for improvement, observations about how the employee performed in specific tasks, or congratulations on a job well done.

In this article you will find performance review examples for the following skills:

  • Training Skills The performance review evaluates how well one learns new skills and information through training programs, and applies that new knowledge in the job.
  • Attitude The performance review evaluates one’s attitude, work ethic, motivation, and engagement. It looks at aspects like enthusiasm, positivity, adaptability, and receptiveness to feedback.
  • Critical Thinking The performance review evaluates the ability to analyze issues objectively, troubleshoot problems logically, challenge assumptions constructively, and evaluate options rationally.

Each section below contains positive (“Meets or Exceeds Expectations”) and negative (“Below Expectations”) examples for the skills listed above. If you are looking for more performance review examples and phrases for different skills, see our main article: 2000+ Performance Review Phrases: The Complete List.

Part 1 Training Skills

Meets or Exceeds Expectations

✓ We are impressed with his willingness to give instructions.

✓ He is willing to help his staff find suitable training sessions.

✓ He does not mind taking part in on-the-job training. His enthusiasm makes him a valued team member.

✓ He encourages all the team members to join in the necessary training sessions.

✓ He is responsible and makes sure that all members take part in the requisite training sessions.

✓ He is able to learn concepts quickly and adopt them into his performance.

✓ He enthusiastically helps his fellow trainees learn complicated lessons in the training sessions by his “can-do” attitude.

✓ He always finds opportunities to take part in specialized training sessions.

Below Expectations

✗ He does not exchange class concepts with colleagues, though he still joins the training sessions.

✗ He does not require his staff to take part in the training sessions.

✗ He appears to think that the training sessions are not important so he does not concentrate on them.

✗ He should not joke in the training classes. He should concentrate on the content of training lessons.

✗ He should take part in more training opportunities and concentrate on them.

✗ He should prepare before coming to training sessions; this will help him absorb all the knowledge the trainers are imparting.

✗ He does not know how to apply the knowledge learned in training sessions into his performance.

✗ He declares that training sessions are not necessary despite the many new challenges he and his team are facing.

Related: Training Ability: Performance Review Examples (1 – 5)

Related: 2000+ Performance Review Phrases: The Complete List

Part 2 Attitude

✓ He builds an atmosphere of trust within his team.

✓ His attitude is an asset to his management skills. He continually builds people up. He deals with difficult situations with incredible posture, and focuses on the positives.

✓ He refuses to be downhearted under incredible pressure. He stands firm and the situation does not affect his attitude. He has a unique mind-set that is highly appreciated.

✓ His attitude is always spot on.

✓ His cheerful attitude makes others feel good when he’s around.

✓ He accentuates the positives in most situations.

✓ He maintains a steady, positive attitude that helps his employees feel everything is ok, even when it seemingly isn’t.

✓ He always has a positive attitude. He frequently has a smile on his face and you can tell he enjoys his job.

✓ He is a cheerful person and people feel delighted to be around him. We are fortunate to have him on our team.

✓ He has an even demeanor through good times and bad, which helps others keep their “enthusiasm”, both positive and negative, in check.

✓ He has a soothing and calm personality. His attitude is one which should be emulated.

✓ He brings a level of enthusiasm to his job that few can. He is always one member of the team we go to when we need a lift.

✓ He is adept at maneuvering around any obstacles that are thrown at him. He finishes his work on time and with accuracy.

✓ He always smiles when he goes to work. He does not mind showing off his positive attitude to the team.

✓ He has never complained about his job or his colleagues.

✓ He always provides assistance when necessary, even beyond his normal fixed duty hours or at weekends.

✓ He begins each day revitalized and ready for any challenges he may face.

✓ He encourages his colleagues at work. He easily builds an atmosphere of trust within his team.

✓ His attitude is a valuable facet of his management skills. He continuously strengthens the connections among his subordinates. He resolves difficult situations in an amazing manner.

✓ He is firm, determined and confident. His attitude to work is not diminished in any situation. His strong-mindedness is very much appreciated.

✓ He can deal with internal pressure excellently. He maintains the right attitude towards external relationships.

✓ Bill is such a cheerful person that everyone feels good with him around.

✓ He usually focuses his attention on the positives when dealing with problems.

✓ He has a positive attitude and he is willing to share his thoughts about relevant matters.

✓ He congratulates staff on jobs well done. He maintains a good and comfortable environment for his team.

✓ His positive attitude in his management role is his most important and effective skill. He is cooperative with others and tries to connect everybody together. He maintains an incredible posture and focuses on the positives when solving problems.

✓ He is always enthusiastic and helps motivate other team members.

✓ He is very level-headed and handles stressful situations with ease.

✓ He is quick to congratulate coworkers and builds trust within his team.

✗ His choice of language can be inappropriate. His level of voice is also inappropriate at times.

✗ He has a tendency to trigger problems between his coworkers.

✗ He excessively tries to talk about non-work related topics. He should keep his conversations work related.

✗ He has a tendency to be irked over minor things. His volatile nature makes his coworkers uneasy.

✗ He should concentrate more on the positive sides of his duties and his team.

✗ He has a tendency to be disruptive after difficult decisions have been announced.

✗ He fails to maintain his normally positive attitude in stressful situations and often resorts to inappropriate tone and language.

✗ His talents are very valuable to the company; nevertheless, his attitude to work is somewhat lacking.

✗ His demeanor can be unapproachable and this negatively impacts the morale of his team.

✗ He needs to control his temper and attitude; he needs to learn not to react too strongly to negative situations.

✗ He can be the cause of conflicts between his coworkers.

✗ He has an overly sensitive and pessimistic personality. He is easily upset by problems or difficult situations. He should try to think more positively, focus on the good and avoid being overly concerned with his perceived negatives.

✗ He often has troubles with his coworkers when they work together. He is easily angered and argumentative with his colleagues.

✗ His argumentative attitude in conversations with his colleagues can make them angry. He is clearly aware of this but has made no attempt to change his demeanor when talking about non-work related topics. It is highly recommended that he improve his attitude.

✗ He is somewhat pessimistic and his moods affect other employees.

✗ He is careless and unmotivated, which ultimately slows the team down.

✗ He does not handle stressful situations well.

✗ He complains about coworkers too often. His complaints build resentment within his team.

✗ He is not approachable and is tough to work with.

✗ He continually talks negatively about projects or tasks assigned to him.

✗ He always has a negative attitude towards his work.

✗ His attitude is poor in general, and it impacts his work.

✗ He talks negatively about other team members.

✗ He needs to focus on the positive aspects of his job and his team.

✗ He constantly disrespects his manager.

✗ He often isolates himself and does not take constructive criticism well.

See also: 2000+ Performance Review Phrases: The Complete List

Part 3 Critical Thinking

✓ He uses sharp ideas and critical thinking ability to solve issues quickly.

✓ He decides the most appropriate solution to an issue by considering the pros and cons of it very cautiously.

✓ He responds to any issue very quickly. He always finds solutions for problems on time.

✓ He has strong reasoning and critical-thinking skills that help him handle problems well.

✓ He is careful and always thoroughly considers everything before submitting any idea.

✓ He always thinks twice before tackling a problem. He rarely struggles, even when faced with extreme situations.

✓ He uses an arsenal of creative strategies to proactively solve a wide range of problems.

✓ He consistently generates outstanding solutions to the most demanding problems.

✓ He focuses on solutions to problems not the symptoms.

✓ He has powerful analytical skills that he applies to every stage of the problem-solving process.

✓ He establishes workable, prioritized, and highly effective problem-solving plans for each problem.

✓ He varies his problem-solving style to meet the nature and demands of the problem itself.

✓ He approaches all problems with confidence and the expectation that he will generate solid and innovative solutions.

✓ He actively seeks out problems that require the most creative thinking.

✓ He quickly analyzes the pros and cons of any solution before deciding what is the most effective way to resolve a problem.

✓ He finds timely solutions to any problem quickly and effectively.

✗ He never considers potential changes in circumstances when making decisions.

✗ He rarely thoroughly thinks decisions through.

✗ He does not have the ability to implement a reasonable solution to an issue.

✗ His suggestions are never unique.

✗ He has a tendency to make decisions without thinking through their potential consequences.

✗ He makes decisions hastily without thinking.

✗ He never considers the possible consequences of his decisions.

✗ He never thinks twice before making any decision.

✗ He often does not consider the situations that may occur while performing his duties.

✗ His decision making process is too hurried.

✗ He employs problem-solving techniques that end up generating even more problems.

✗ He creates more problems than he solves.

✗ He overlooks or underestimates problems until they become major issues.

✗ He analyzes minor issues and lets larger problems fester and grow.

✗ He decides on the solution before properly analyzing it.

✗ He comes up with solutions that are incorrect, insufficient, and invalid.

✗ He doesn’t analyze the situation and the potential effect of his solutions carefully before making a final decision.

✗ He is unable to find the most effective solutions for problems because his ideas are always mediocre.

Related: Critical Thinking: Performance Review Examples (1 – 5)

Learn more:

  • How to Give Performance Feedback? Techniques and Examples (Positive, Negative, STAR Feedback)
  • How to Identify and Handle Employee Underperformance? (5 Steps)
  • Effective Performance Management: How to Coach Employees?

See also: 142 Employee Performance Evaluation Examples (Initiative, Flexibility, Proactiveness)

More Performance Review Examples

  • 142 Employee Performance Evaluation Examples (Initiative, Flexibility, Proactiveness, Goal-Setting)
  • Employee Evaluation Example [Guide for Managers]
  • How to Write an Employee Evaluation [Examples]
  • Job Knowledge Performance Review Phrases (Examples)
  • How to Write an Effective Performance Review (Essential Steps)
  • 100 Performance Review Phrases for Job Knowledge, Judgment, Listening Skills

This article provides an overview of evaluation skills, including what they are, why they are important, and how to develop them. It is written for anyone interested in improving their science learning and critical thinking skills.

Evaluation Skills: A Comprehensive Overview

Evaluation skills are a vital part of any student's education and are essential for success in the modern world. From determining the quality of evidence to assessing the validity of an argument, evaluation skills are essential for critical thinking and problem-solving. In this comprehensive overview, we'll explore the importance of evaluation skills and how they can be developed and applied in all areas of life. From formulating hypotheses to understanding the different types of evidence and the criteria used to assess them, this article will provide an in-depth look at the evaluation process. We'll discuss why evaluation skills are so important, explore how they can be developed, and provide examples of how they can be used in everyday life.

For those looking for more guidance, Spires online BMAT tutors can provide expert advice on developing and applying evaluation skills. By the end of this article, readers will have a better understanding of the importance of evaluation skills, as well as a clear picture of how to develop and apply them in everyday life. Evaluation skills consist of a set of abilities that enable us to analyze information accurately and come to sound conclusions. These skills include critical thinking, problem-solving, data interpretation, and logical reasoning. They are important for science learning and critical thinking because they allow us to evaluate the validity of scientific theories and hypotheses. Developing evaluation skills requires practice and dedication.

It involves developing the ability to ask the right questions, observe carefully, think objectively, and draw accurate conclusions. To develop these skills, it is important to practice analyzing information from different sources and coming to sound conclusions. It is also important to practice questioning assumptions and challenging accepted beliefs. Additionally, it is helpful to read books and articles on science and critical thinking to better understand how to evaluate information accurately. Another important part of developing evaluation skills is understanding the importance of data accuracy and precision.

Data accuracy refers to how closely a measurement or estimate reflects the true value of a phenomenon, while data precision refers to how precise the measurement is. Understanding these concepts helps us evaluate data more accurately. Finally, developing evaluation skills requires honing our communication skills. Being able to effectively communicate our thoughts and opinions helps us share our ideas with others and develop our understanding of the material. Communicating clearly also helps us evaluate information more accurately by making sure that everyone is on the same page when discussing complex topics. In conclusion, evaluation skills are essential for effective learning in any subject.

How Can We Develop Evaluation Skills?

The Importance of Communication

Communication allows us to share our ideas with others, which can then be discussed, debated, and further developed. Communication also allows us to develop a deeper understanding of the material by being able to ask questions and get feedback from peers or experts. In order to develop our communication skills, we need to learn how to express ourselves clearly and concisely. We must also learn how to actively listen to others and truly understand their perspectives. This will allow us to better engage in dialogue and form meaningful connections with our peers.

Why Are Evaluation Skills Important?

Understanding Data Accuracy & Precision

For example, if a measurement is off by 10% from the true value, then it has an accuracy of 90%. On the other hand, data precision is determined by the number of significant figures in a measurement. The more significant figures a measurement has, the more precise it is. In order to evaluate data accurately, it is important to understand how these two concepts interact. For example, if a measurement has high accuracy but low precision, then it may still be inaccurate because of the amount of error in the measurement.

Similarly, a measurement with high precision but low accuracy can still be inaccurate if the true value is significantly different from the measurement. By understanding the importance of both accuracy and precision in data evaluation, we can make more informed decisions about the validity of our conclusions. This is especially important when making decisions based on scientific evidence, as inaccurate or imprecise data can lead to incorrect conclusions and inaccurate results. In conclusion, evaluation skills are an essential part of science learning and critical thinking. Developing these skills requires practice and dedication, as well as an understanding of data accuracy and precision. Finally, effective communication is a key component of mastering these skills, as it allows us to share our ideas with others.
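The accuracy figure used in the example above (a measurement off by 10% from the true value has an accuracy of 90%) can be sketched as a tiny calculation. This is an illustrative snippet, not a standard library function; the helper name `accuracy_pct` is ours:

```python
def accuracy_pct(measured, true_value):
    """Accuracy expressed as 100% minus the relative error.

    Matches the article's example: a measurement that is off by
    10% from the true value has an accuracy of 90%.
    """
    relative_error = abs(measured - true_value) / abs(true_value)
    return 100 * (1 - relative_error)

# Off by 10% from a true value of 100 -> 90% accurate.
print(accuracy_pct(90.0, 100.0))   # 90.0

# Precision is a separate question: 3.14159 (six significant figures)
# is more precise than 3.1 (two significant figures), yet either could
# be the less accurate one depending on the true value.
```

Note that the function says nothing about precision: a reading of 90.00000 and a reading of 90 score the same accuracy here, even though the first is far more precise.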

Gabrielle Lamonte

Passionate about science education and committed to making science accessible and engaging for all students with a Bachelor's degree in Science Education from the University of Birmingham.



SEP home page

  • Table of Contents
  • Random Entry
  • Chronological
  • Editorial Information
  • About the SEP
  • Editorial Board
  • How to Cite the SEP
  • Special Characters
  • Advanced Tools
  • Support the SEP
  • PDFs for SEP Friends
  • Make a Donation
  • SEPIA for Libraries
  • Entry Contents


Academic tools.

  • Friends PDF Preview
  • Author and Citation Info
  • Back to Top

Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

  • 2. Examples and Non-Examples
  • 2.1 Dewey’s Three Main Examples
  • 2.2 Dewey’s Other Examples
  • 2.3 Further Examples
  • 2.4 Non-Examples
  • 3. The Definition of Critical Thinking
  • 4. Its Value
  • 5. The Process of Thinking Critically
  • 6. Components of the Process
  • 7. Contributory Dispositions and Abilities
  • 8.1 Initiating Dispositions
  • 8.2 Internal Dispositions
  • 9. Critical Thinking Abilities
  • 10. Required Knowledge
  • 11. Educational Methods
  • 12.1 The Generalizability of Critical Thinking
  • 12.2 Bias in Critical Thinking Theory and Pedagogy
  • 12.3 Relationship of Critical Thinking to Other Types of Thinking
  • Other Internet Resources
  • Related Entries

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment . Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History.

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

2.1 Dewey’s Three Main Examples

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).

2.2 Dewey’s Other Examples

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather : A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder : A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid : A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur : A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
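The quantitative core of the suction-pump example can be checked directly: a suction pump only removes air, so it is atmospheric pressure that pushes the water up, and the maximum column height h satisfies P = ρgh. The following sketch is illustrative only (the function name and the sample elevation pressure are assumptions, not from the source):

```python
# Why a suction pump lifts water only about 33 feet: the atmosphere,
# not the pump, pushes the water up, so the column height is capped
# at h = P_atm / (rho * g).

RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2
M_PER_FT = 0.3048    # metres per foot

def max_suction_lift_ft(p_atm_pa: float) -> float:
    """Maximum height (feet) to which suction can raise water
    under atmospheric pressure p_atm_pa (pascals)."""
    h_metres = p_atm_pa / (RHO_WATER * G)
    return h_metres / M_PER_FT

# At sea level (101325 Pa) the limit is roughly 34 feet, matching the
# observed maximum of about 33 feet:
print(round(max_suction_lift_ft(101325.0), 1))  # 33.9

# At higher elevations atmospheric pressure is lower, so the maximum
# height is lower too -- exactly what Dewey's scientist observed.
# E.g., assuming ~79500 Pa at about 2000 m:
print(round(max_suction_lift_ft(79500.0), 1))   # 26.6
```

The agreement between the computed ceiling and the observed 33-foot limit is what licenses assimilating the pump to the siphon and the balloon: all three are effects of air pressure.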

2.3 Further Examples

Diamond : A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash : A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate : Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

2.4 Non-Examples

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

3. The Definition of Critical Thinking

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). 
Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

4. Its Value

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

5. The Process of Thinking Critically

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions, in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis, to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition (reasoning, in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. 
Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).
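The checklist conception lends itself to a simple formal rendering: the twelve component events form a fixed repertoire, while a particular episode of critical thinking is any selective, reorderable, repeatable sequence drawn from it. The sketch below is an illustrative model only, and the sample trace is one possible reading of Dewey's Bubbles example, not his own analysis:

```python
# A minimal model of the checklist conception: component events are a
# fixed repertoire; an episode of critical thinking is any non-empty
# sequence of them -- in any order, selectively, with repetition.

from enum import Enum

class Step(Enum):
    NOTICE_DIFFICULTY = 1
    DEFINE_PROBLEM = 2
    DIVIDE_PROBLEM = 3
    FORMULATE_SOLUTIONS = 4
    DETERMINE_RELEVANT_EVIDENCE = 5
    DEVISE_OBSERVATION_PLAN = 6
    CARRY_OUT_PLAN = 7
    NOTE_RESULTS = 8
    GATHER_TESTIMONY = 9
    JUDGE_CREDIBILITY = 10
    DRAW_CONCLUSIONS = 11
    ACCEPT_SOLUTION = 12

def is_valid_trace(trace):
    """No fixed order is imposed: any non-empty sequence of steps counts."""
    return len(trace) > 0 and all(isinstance(s, Step) for s in trace)

# One reading of the Bubbles example: hypothesising and testing recur
# (heated-air hypothesis, then the cooling/contraction hypothesis),
# while the testimony-gathering steps (9)-(10) are skipped entirely.
bubbles_trace = [
    Step.NOTICE_DIFFICULTY, Step.DEFINE_PROBLEM,
    Step.FORMULATE_SOLUTIONS, Step.DETERMINE_RELEVANT_EVIDENCE,
    Step.DEVISE_OBSERVATION_PLAN, Step.CARRY_OUT_PLAN, Step.NOTE_RESULTS,
    Step.FORMULATE_SOLUTIONS, Step.DEVISE_OBSERVATION_PLAN,
    Step.CARRY_OUT_PLAN, Step.NOTE_RESULTS,
    Step.DRAW_CONCLUSIONS, Step.ACCEPT_SOLUTION,
]

print(is_valid_trace(bubbles_trace))  # True
```

The point of the model is negative: unlike Dewey's five phases read as steps, nothing in the checklist constrains order, completeness, or repetition.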

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

6. Components of the Process

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing : One notices something in one’s immediate environment (sudden cooling of temperature in Weather , bubbles forming outside a glass and then going inside in Bubbles , a moving blur in the distance in Blur , a rash in Rash ). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder , no suction without air pressure in Suction pump ).
  • Feeling: One feels puzzled or uncertain about something (how to get to an appointment on time in Transit, why the diamonds vary in spacing in Diamond). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit, diamonds closer when needed as a warning in Diamond).
  • Wondering: One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles, how suction pumps work in Suction pump, what caused the rash in Rash).
  • Imagining: One thinks of possible answers (bus or subway or elevated in Transit, flagpole or ornament or wireless communication aid or direction indicator in Ferryboat, allergic reaction or heat rash in Rash).
  • Inferring: One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder, earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit, burglary in Disorder, discontinue blood pressure medication and new cream in Rash).
  • Knowledge: One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit, of the requirements for a flagpole in Ferryboat, of Boyle’s law in Bubbles, of allergic reactions in Rash).
  • Experimenting: One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat, putting an ice cube on top of a tumbler taken from hot water in Bubbles, measuring the height to which a suction pump will draw water at different elevations in Suction pump, noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond).
  • Consulting: One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments: One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate. It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging: One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding: One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify the following initiating dispositions for thinking critically: attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness: One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry: Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence: Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage: Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness: A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998; Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment: Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason: Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth: If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions.

9. Critical Thinking Abilities

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit, has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in Glaser (1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5. The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities: Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities: The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities: A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities: Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities: The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit, Ferryboat and Disorder), others from something observed (as in Weather and Rash). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452).
Items testing inferential abilities constitute two of the five subtests of the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).
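The notion of a conclusion following necessarily in virtue of an argument's form can be made concrete computationally. The following minimal Python sketch (the function name and encoding are my own illustrative choices, not part of any cited test) checks a propositional argument form by brute-force truth-table enumeration: the form is valid just in case every assignment that makes all the premisses true also makes the conclusion true.

```python
from itertools import product

def valid(premises, conclusion, num_vars):
    """Check formal validity of a propositional argument form by
    enumerating every truth assignment: the conclusion must be true
    on every assignment that makes all premises true."""
    for values in product([True, False], repeat=num_vars):
        if all(p(*values) for p in premises) and not conclusion(*values):
            return False  # counterexample found: premises true, conclusion false
    return True

# Modus ponens: from "if p then q" and "p", infer "q" -- formally valid.
modus_ponens = valid(
    [lambda p, q: (not p) or q, lambda p, q: p],
    lambda p, q: q,
    num_vars=2,
)

# Affirming the consequent: from "if p then q" and "q", infer "p" -- invalid.
affirming_consequent = valid(
    [lambda p, q: (not p) or q, lambda p, q: q],
    lambda p, q: p,
    num_vars=2,
)

print(modus_ponens, affirming_consequent)  # True False
```

The contrast with the article's examples is instructive: the checker certifies modus ponens but rejects affirming the consequent, whereas the substantive inferences in Transit or Rash would all be rejected by such a test, since their warrant comes from domain knowledge rather than logical form.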

Experimenting abilities: Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash. Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.
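The need for an adequately large sample size, which Halpern lists among experimenting abilities, can be illustrated with a small simulation (a hedged sketch; the function name, parameters, and figures are invented for this example, not drawn from any cited source). It repeatedly estimates a known population proportion from samples of different sizes and reports the average error of the estimate.

```python
import random

def estimate_error(true_p, n, trials=2000, seed=0):
    """Simulate repeatedly drawing a sample of size n from a population
    in which a fraction true_p has some property, and return the average
    absolute error of the sample proportion as an estimate of true_p."""
    rng = random.Random(seed)
    total_error = 0.0
    for _ in range(trials):
        hits = sum(rng.random() < true_p for _ in range(n))
        total_error += abs(hits / n - true_p)
    return total_error / trials

small = estimate_error(0.3, n=10)     # ten observations per sample
large = estimate_error(0.3, n=1000)   # a thousand observations per sample
print(small, large)  # the error shrinks roughly with the square root of n
```

The simulation makes vivid why a conclusion drawn from ten observations deserves much less confidence than the same conclusion drawn from a thousand.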

Consulting abilities: Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate. Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities: The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate. The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills: Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

10. Required Knowledge

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). 
It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).
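One of the noise-reduction strategies listed above, getting independent judgments first and combining them, can be illustrated with a short simulation (an illustrative sketch only; the function and parameter names are invented for this example, and the figures are arbitrary). Averaging several independent noisy judgments of the same quantity reduces the unsystematic variability of the final judgment.

```python
import random
import statistics

def judgment_noise(group_size, true_value=100.0, sd=20.0, cases=5000, seed=1):
    """Simulate noisy judgments of a quantity with a known true value.
    Each case's final judgment averages `group_size` independent
    judgments; return the standard deviation of the final judgments
    across cases, a simple measure of noise."""
    rng = random.Random(seed)
    finals = [
        statistics.mean(rng.gauss(true_value, sd) for _ in range(group_size))
        for _ in range(cases)
    ]
    return statistics.stdev(finals)

solo = judgment_noise(group_size=1)      # one judge per case
averaged = judgment_noise(group_size=9)  # average of nine independent judges
print(solo, averaged)  # averaging nine judgments cuts noise to about a third
```

The reduction follows the familiar square-root law: averaging nine independent judgments divides the noise by roughly three, which is why the strategy of collecting independent judgments before any group discussion pays off.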

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.
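The subject-matter knowledge at work in Bubbles can be made explicit with a short calculation. Assuming only the relationship stated above, that the volume of an enclosed gas at constant pressure varies directly with its absolute temperature, the sketch below (the function name and the particular figures are illustrative assumptions, not from the original example) shows how much trapped air contracts as it cools.

```python
def gas_volume(v1, t1_c, t2_c):
    """Volume of an enclosed gas at constant pressure (Charles's law):
    volume is proportional to absolute temperature. Temperatures are
    given in Celsius and converted to kelvin before taking the ratio."""
    t1_k = t1_c + 273.15
    t2_k = t2_c + 273.15
    return v1 * (t2_k / t1_k)

# Air trapped in a tumbler lifted from hot water: cooling from 60 C to
# 20 C shrinks 300 ml of air by about 12 percent, drawing water inward.
v2 = gas_volume(300.0, 60.0, 20.0)
print(round(v2, 1))
```

A common mistake the conversion guards against is taking the ratio of Celsius temperatures directly, which would wrongly predict a threefold contraction; without the relevant subject-matter knowledge, no amount of general inferential skill yields the right answer.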

11. Educational Methods

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment.

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods.

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favored ways are those of a dominant sex or culture (Bailin 1995). These ways favor:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1998)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint from problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History.

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking , Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 31–46. doi:10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Blair, J. Anthony, 2021, Studies in Critical Thinking , Windsor, ON: Windsor Studies in Argumentation, 2nd edition.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking , Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul, 2021, “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 150–167. doi:10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at ; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at ; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at ; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery , Vila Real: UTAD. Available at http:/; last accessed 2022 07 16.
  • Dominguez, Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol , Vila Real: UTAD. Available at http:/; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at ; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi:10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi:10.1163/9789004444591_014
  • Hiner, Amanda L., 2021, “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 188–208. doi:10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi:10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • –––, 2019, “Critical Thinking as Discourse”, Human Development , 62(3): 146–164. doi:10.1159/000500171
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • –––, 2003, Thinking in Education , Cambridge: Cambridge University Press, 2nd edition.
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Makaiau, Amber Strong, 2021, “The Good Thinker’s Tool Kit: How to Engage Critical Thinking and Reasoning in Secondary Education”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 168–187. doi:10.1163/9789004444591_011
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Moore, Brooke Noel and Richard Parker, 2020, Critical Thinking , New York: McGraw-Hill, 13th edition.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Past papers available at; last accessed 2022 07 16.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at ; last accessed 2022 07 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2013c, “A Fatal Flaw in the Collegiate Learning Assessment Test”, Assessment Update , 25 (1): 8–12.
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • –––, 2020, “CAT Scan: A Critical Review of the Critical-Thinking Assessment Test”, Informal Logic , 40 (3): 489–508. [Available online at]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rear, David, 2019, “One Size Fits All? The Limitations of Standardised Assessment in Critical Thinking”, Assessment & Evaluation in Higher Education , 44(5): 664–675. doi: 10.1080/02602938.2018.1526255
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at ; last accessed 2022 07 16.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simon, Herbert A., 1956, “Rational Choice and the Structure of the Environment”, Psychological Review , 63(2): 129–138. doi: 10.1037/h0042769
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2018, Curriculum for the Compulsory School, Preschool Class and School-age Educare , Stockholm: Skolverket, revised 2018. Available at; last accessed 2022 07 15.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich Keith E., and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development , New York: Springer Publishing, pp 195–237.
  • Stanovich Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence , Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = < >
  • Vincent-Lancrin, Stéphan, Carlos González-Sancho, Mathias Bouckaert, Federico de Luca, Meritxell Fernández-Barrerra, Gwénaël Jacotin, Joaquin Urgel, and Quentin Vidal, 2019, Fostering Students’ Creativity and Critical Thinking: What It Means in School. Educational Research and Innovation , Paris: OECD Publishing.
  • Warren, Karen J. 1988. “Critical Thinking and Feminism”, Informal Logic , 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual , San Antonio, TX: Psychological Corporation,
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Willingham, Daniel T., 2019, “How to Teach Critical Thinking”, Education: Future Frontiers , 1: 1–17. [Available online at]
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763

Copyright © 2022 by David Hitchcock <hitchckd@mcmaster.ca>


The Stanford Encyclopedia of Philosophy is copyright © 2023 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054

Christopher Dwyer Ph.D.

3 Activities to Enhance Your Evaluation in Critical Thinking

This set of exercises may help enhance critical thinking.

Posted July 9, 2021 | Reviewed by Devon Frye

Back in January, I posted a piece on this blog asking if any readers wanted to develop their critical thinking as a kind of New Year’s Resolution. Following some rather positive feedback, I added a second set of exercises on "analysis."

The third set of exercises in this sequence, regarding "evaluation," can be found below. If you’re interested in enhancing your critical thinking skills, please start with the first set of exercises and then the second, before jumping into this next set. Remember, when we are given opportunities to think about our thinking, we are engaging our metacognitive processes; and that’s a foundational part of critical thinking!

For these exercises, find and evaluate arguments from online articles about a current social problem.

Exercise 1:

Find two examples of each type of evidence from the articles you’ve read and evaluate the credibility of the proposition. That is, is the source reliable? Can it be trusted? If not, why? Remember, personal experience and "common belief" statements are among the worst sources for credible information, so be on the lookout for those!

Assess the evidence you find according to the following types:

  • Anecdotal Evidence/Personal Experience
  • Common Belief/Sense Statements
  • Expert Opinion
  • Statistical Evidence
  • Research Data

Exercise 2:

Of course, there is more to "evaluation" than an assessment of credibility. In critical thinking, we must also evaluate the relevance, logic, balance, and bias among the propositions of support and refutation. Think about the articles you read and try answering the following questions in light of your reading:

  • Is there any information presented that is irrelevant to the central claim that the article is trying to make?
  • Is the overall argument imbalanced in any way?
  • Is the overall argument biased in any way?
  • Does the argument exclude important arguments?
  • Are there hidden assumptions that need to be made more explicit?

Exercise 3:

Finally, consider whether the reasoning presented within these articles is strong enough, in light of your evaluations, to support the conclusion(s) drawn. Are there other reasonable conclusions possible? Again, consider the credibility, relevance, bias, balance, and logic of the information presented in the articles in the context of your consideration.

These activities are a great way to start working on the critical thinking skill of evaluation, as they help you to assess the strengths and weaknesses of an argument proposition by proposition. Depending on how thorough your evaluation was, considering the credibility, relevance, logical strength, balance, and bias of the network of propositions within an argument should give you confidence when inferring a conclusion. In Exercise 4, we will delve further into the skill of inference.


Christopher Dwyer, Ph.D., is a lecturer at the Technological University of the Shannon in Athlone, Ireland.



  • Evaluate and Critique a Model or Argument

Examples in the ABLConnect Database

  • The Spectrum Game
  • Simulated Investment in Genome Editing Technology
  • Learning through Case Construction

Evaluate and Critique a Model or Argument can be worked into a variety of activity types! Here are just a few examples:

Research : A research project can be expanded to include a peer-review step. A peer review allows students to evaluate their classmates' models and provides the author with feedback on how well they defended their point. Critically, establish a set of rules and class norms so that students' reviews are constructive rather than hostile. If you anticipate problems here, you may also have students submit reviews to you so that you can rephrase feedback before handing it back to the author.

Think-Pair-Share : After introducing a new model to the class in lecture, take a pause to allow students to evaluate the model on their own. Give a few minutes for each student to think critically on their own and jot down a few strengths/weaknesses of the new model. Then encourage students to speak to their neighbors to compare their evaluations. Finally, bring the class back together to have a larger class discussion.

There are also ways to focus on Evaluate and Critique a Model or Argument in unexpected activity types.

Statement Corrections : Have a series of statements proposing different models/arguments based on the same set of primary sources. Have students compare each different model, pick the one they feel is the most accurate, then correct the other statements to reflect their choice. If done in independent groups, bring the class back together to see if groups had differing opinions and discuss what aspects of the model convinced them. End by having the class settle on a single model that best represents the primary sources.


The Peak Performance Center

The pursuit of performance excellence

Critical Thinking

Critical thinking refers to the process of actively analyzing, assessing, synthesizing, evaluating and reflecting on information gathered from observation, experience, or communication. It is thinking in a clear, logical, reasoned, and reflective manner to solve problems or make decisions. Basically, critical thinking is taking a hard look at something to understand what it really means.

Critical Thinkers

Critical thinkers do not simply accept all ideas, theories, and conclusions as facts. They have a mindset of questioning ideas and conclusions. They make reasoned judgments that are logical and well thought out by assessing the evidence that supports a specific theory or conclusion.

When presented with a new piece of information, critical thinkers may ask questions such as:

“What information supports that?”

“How was this information obtained?”

“Who obtained the information?”

“How do we know the information is valid?”

“Why is it that way?”

“What makes it do that?”

“How do we know that?”

“Are there other possibilities?”

Critical Thinking

Combination of analytical and creative thinking

Many people perceive critical thinking as just analytical thinking. However, critical thinking incorporates both analytical thinking and creative thinking. Critical thinking does involve breaking information down into parts and analyzing those parts in a logical, step-by-step manner. However, it also involves challenging consensus to formulate new creative ideas and generate innovative solutions. It is critical thinking that helps you evaluate and improve your creative ideas.

Critical Thinking Skills

Elements of Critical Thinking

Critical thinking involves:

  • Gathering relevant information
  • Evaluating information
  • Asking questions
  • Assessing bias or unsubstantiated assumptions
  • Making inferences from the information and filling in gaps
  • Using abstract ideas to interpret information
  • Formulating ideas
  • Weighing opinions
  • Reaching well-reasoned conclusions
  • Considering alternative possibilities
  • Testing conclusions
  • Verifying that the evidence and arguments support the conclusions

Developing Critical Thinking Skills

Critical thinking is considered a higher-order thinking skill, encompassing analysis, synthesis, deduction, inference, reasoning, and evaluation. In order to demonstrate critical thinking, you would need to develop skills in:

Interpreting : understanding the significance or meaning of information

Analyzing : breaking information down into its parts

Connecting : making connections between related items or pieces of information

Integrating : connecting and combining information to better understand the relationships between the pieces

Evaluating : judging the value, credibility, or strength of something

Reasoning : creating an argument through logical steps

Deducing : forming a logical opinion about something based on the available information or evidence

Inferring : figuring something out through reasoning based on assumptions and ideas

Generating : producing new information, ideas, products, or ways of viewing things


Critical thinking definition


Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement.

The critical thinking process involves the active and skillful analysis, assessment, synthesis, and evaluation of information obtained from observation, knowledge, reflection, or conversation, as a guide to belief and action, which is why it is so often emphasized in education and academics.

Some even may view it as a backbone of modern thought.

However, it is a skill, and skills must be trained and encouraged in order to be used to their full potential.

People turn to various approaches to improve their critical thinking, such as:

  • Developing technical and problem-solving skills
  • Engaging in more active listening
  • Actively questioning their assumptions and beliefs
  • Seeking out more diversity of thought
  • Cultivating their intellectual curiosity

Is critical thinking useful in writing?

Critical thinking can help in planning your paper and making it more concise, though this is not obvious at first. We have carefully pinpointed some of the questions you should ask yourself when bringing critical thinking into your writing:

  • What information should be included?
  • Which information resources should the author look to?
  • What degree of technical knowledge should the report assume its audience has?
  • What is the most effective way to show information?
  • How should the report be organized?
  • How should it be designed?
  • What tone and level of language difficulty should the document have?

The use of critical thinking extends beyond the outline of your paper; it also raises the question: how can we apply critical thinking to the problems addressed by our writing's topic?

Let's say you have a PowerPoint presentation on how critical thinking can reduce poverty in the United States. You will first have to define critical thinking for your viewers, and then use plenty of critical thinking questions and related terms to familiarize them with your methods and the thinking process behind them.

Are there any services that can help me use more critical thinking?

We understand that it's difficult to learn how to use critical thinking more effectively in just one article, but our service is here to help.

We are a team specializing in writing essays and other assignments for college students and anyone else who needs a helping hand. We cover a great range of topics, offer high-quality work, always deliver on time, and aim to leave our customers completely satisfied with what they ordered.

The ordering process is fully online, and it goes as follows:

  • Select the topic and the deadline of your essay.
  • Provide us with any details, requirements, statements that should be emphasized or particular parts of the essay writing process you struggle with.
  • Leave the email address, where your completed order will be sent to.
  • Select your preferred payment type, sit back, and relax!

With lots of experience on the market, professionally degreed essay writers , online 24/7 customer support and incredibly low prices, you won't find a service offering a better deal than ours.


31 examples of problem solving performance review phrases


Use these practical examples of phrases, sample comments, and templates for your performance review , 360-degree feedback survey, or manager appraisal.

The following examples relate not only to problem-solving but also to conflict management, effective solutions, selecting the best alternatives, decision making, problem identification, and analyzing effectively — in short, to becoming an effective problem-solving strategist. Start using effective performance review questions to better guide your workforce's development.

Problem solving appraisal comments: you're doing great

  • You always maintain an effective dialogue with clients when they have technical problems. Being clear and articulate makes sure our customers' issues are attended to promptly.
  • You constantly make sure to look beyond the obvious and never stop at the first answer. You’re really good at exploring alternatives. Well done!
  • Keeping supervisors and managers informed of status changes and requests is important. You’re really good at communicating changes to projects at all times. Keep it up!
  • You stay cool and collected even when things aren’t going according to plan or are up in the air. This is a great trait to possess. Well done!
  • Effectively diagnosing complex problems and reaching sustainable solutions is one of your strong points. You’re excellent at giving an honest and logical analysis. Keep it up!
  • Your ability to turn complex systems into simple ones is truly a unique skill to possess. Well done!
  • You often identify practical solutions to every roadblock. You’re a real asset to the team! Great job.
  • You always listen actively and attentively to make sure you understand exactly what the problem is, and you come up with solutions in an effective manner.
  • You have an amazing ability to clearly explain options and solutions effectively and efficiently. Well done!
  • When driving projects, you can shift to other areas comfortably and easily, making sure the project runs smoothly. Great job!


Problem solving performance review phrases: you should think of improving

  • You always seem overwhelmed when faced with multiple problems. Try to think of ways to make problems more manageable so that they can be solved in a timely and effective manner.
  • Constantly avoiding conflict with people is not a good idea, as you will only build up personal frustration and nothing will be done to remedy the situation. Try to face people when there are problems and rectify them when they occur.
  • Don’t allow demanding customers to rattle your cage too much. If they become too demanding, take a step back, regulate your emotions, and try to make use of online support tools to help you rectify problems; these tools can help a lot!
  • It’s necessary that you learn from your past mistakes. You cannot keep making the same mistakes, as this is not beneficial to the company.
  • You tend to ask the same questions over and over again. Try to listen more attentively or take notes when colleagues are answering!
  • Providing multiple solutions through an indirect and creative approach will allow you to be more effective at problem-solving. If you struggle with this, try viewing the problem in a new and unusual light.
  • You fail to provide staff with the appropriate amount of structure and direction. They must know the direction you wish them to go in to achieve their goals.
  • You need to be able to recognize repetitive trends to solve problems promptly.
  • You tend to have problems troubleshooting even the most basic of questions. As a problem solver and customer support person, it’s imperative that you can answer these questions easily. Read through your training manual and make sure you fully understand it before attempting questions again.


Performance review tips to improve problem solving

  • Try to complain less about problems and come up with solutions to the problems more often. Complaining is not beneficial to progression and innovation.
  • As a problem solver, it’s important to be able to handle multiple priorities under short deadlines.
  • You need to be able to effectively distinguish between the cause and the symptoms of problems to solve them in an efficient and timely manner.
  • Try to anticipate problems in advance before they become major roadblocks down the road.
  • Try to view obstacles as opportunities to learn and thrive at the challenge of solving the problem.
  • Remember to prioritize problems according to their degree of urgency. It's important that you spend the majority of your time on urgent tasks over menial ones.
  • When putting plans into place, stick to them and make sure they are completed.
  • When solving problems, try to allocate appropriate levels of resources when undertaking new projects. It is important to become as efficient and as effective as possible.
  • Try to learn to pace yourself when solving problems to avoid burnout. You’re a great asset to the team, and we cannot afford to lose you at this point.
  • Meeting regularly with your staff to review results is vital to the problem-solving process.
  • Staff who have regular check-ins understand what is required of them, what they are currently achieving, and where they may need to improve. Try to hold one-on-one meetings every week.


Madeline Miles

Madeline is a writer, communicator, and storyteller who is passionate about using words to help drive positive change. She holds a bachelor's in English Creative Writing and Communication Studies and lives in Denver, Colorado. In her spare time, she's usually somewhere outside (preferably in the mountains) — and enjoys poetry and fiction.



16 Critical Thinking Examples in Real Life

What Is Critical Thinking?

When writing an academic assignment or thesis, you have to research and analyse various sources; when making a career decision, or any other major decision, you have to weigh all of its pros and cons. The skill that helps us make these decisions effectively is what we call critical thinking, and it matters in both personal and professional life. Critical thinking involves analysing the facts and figures of a situation before acting on it. It demands keen observation, creativity, and problem-solving skills, which help an individual thoroughly evaluate the information gathered and then use it as a guide to making sound decisions. From academic work and routine activities to solving large-scale problems, critical thinking is needed in everyday life. In this article, we will look at some real-life examples where critical thinking plays an important role.

Critical Thinking Examples in Real Life

1. Critical Thinking in Problem-Solving

Suppose your manager asks you to find an effective solution to a problem that is affecting the business. What would be your first step? Like most people, you might start looking straight away for potential solutions. This is where critical thinking is required: before looking for a solution, take a step back and try to understand the cause of the problem first. Ask other people how the problem affects them and the overall business. Once you arrive at a solution, do not rely on it alone; always have backup plans in case the first solution does not work as expected. Most people feel they are great at problem-solving, but anyone who skips these steps before making a final judgement is not thinking critically. Critical thinking allows people to find the best possible solution to a problem, and it requires looking at the situation from multiple perspectives, because in many cases your decisions affect not only you but also the people around you.

2. Critical Thinking in Analysing Risks

Risk assessment is another area that requires critical thinking. It is needed in many contexts, from a child weighing the impact of junk food on their health to a large business weighing the impact of a policy on the company's growth. Critical thinking lets you identify the possible consequences of a choice before committing to it.

3. Critical Thinking in Data Analysis

Whether you are analysing pupils' performance in a school or the growth of a multinational company, data-analysis skills are crucial. Today almost every sector needs experts who can accurately evaluate the available data and draw effective conclusions from it. With the rise of technology, many data-analysis tasks, such as calculating profit and loss, creating balance sheets, and issuing invoices, are done by software, but that does not mean human skill is no longer required. Software can convert a large amount of data into a simpler, readable format, but human critical thinking is needed to interpret the data effectively and put the resulting insight to use. Data analysis can even help us estimate future trends and the potential risks of a decision.

4. Critical Thinking in Hiring Employees

The ability to view a situation objectively, without being influenced by personal beliefs, is one of the hallmarks of critical thinking. In business, hiring managers need critical thinking to evaluate large numbers of résumés and choose suitable candidates for a position. Critical thinking keeps a hiring manager from selecting or rejecting candidates on the basis of factors such as gender, age, religion, or nationality, factors that can influence decisions unconsciously. Without it, the manager may tend to choose candidates according to subjective beliefs. Critical thinking therefore helps HR hire the best employees, which ultimately contributes to the growth of the company.

5. Promoting Teamwork

In a team, every individual is unique and has different ideas for tackling the problem at hand. It is the team leader's responsibility to understand each member's perspective and encourage the group to work collectively towards the common goal. You may find another member's opinion ineffective, but instead of dismissing it outright, analyse the suggestion logically and then put forward your own point of view calmly and effectively. If the team leader does not think critically and instead imposes his or her opinions on others, the team is likely to collapse.

6. Critical Thinking in Self-Evaluation

Critical thinking plays a major role in self-evaluation. It allows you to assess your performance accurately by controlling various subjective biases. People should regularly examine how they react to situations and how they think; this gives them deeper insight into their thought processes and improves their ability to make sound decisions. Self-evaluation matters in professional life too. Suppose your manager sets a new target for the company. Every employee then needs to assess his or her contribution and work towards that target. Knowing your contribution helps you analyse your performance and improve in the areas where you lag.

7. Critical Thinking in Choosing a Career

Almost all of us face dilemmas such as choosing a stream of study, a type of job, or between a regular college degree and an online programme. Whatever you choose, every option has its pros and cons. Critical thinking allows you to weigh the positives and negatives of each option accurately and choose the one whose benefits outweigh its drawbacks. The best way to do this is to list the pros and cons and then analyse them. This is not limited to career choices; it applies to other professional and financial decisions too: you can list the pros and cons of working for a specific company or of choosing a particular insurance plan. Our choices are often heavily influenced by those of our friends and acquaintances, but every individual's beliefs, desires, and ambitions are different, so a career or job that is best for someone else is not necessarily the best option for you. Choosing the right career path, then, requires critical thinking.
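That pros-and-cons analysis can be made slightly more systematic by attaching a weight to each item and comparing net totals. Below is a minimal Python sketch; the two options, the list items, and the weights are invented purely for illustration:

```python
# Weighted pros-and-cons comparison: each entry is (description, weight 1-5).

def score_option(pros, cons):
    """Return net score: sum of pro weights minus sum of con weights."""
    return sum(w for _, w in pros) - sum(w for _, w in cons)

regular_degree = score_option(
    pros=[("campus network", 4), ("structured schedule", 3)],
    cons=[("higher cost", 5), ("relocation", 2)],
)
online_programme = score_option(
    pros=[("lower cost", 5), ("flexible pacing", 4)],
    cons=[("less networking", 3)],
)

# Pick the option with the higher net score.
best = max(
    [("regular degree", regular_degree), ("online programme", online_programme)],
    key=lambda pair: pair[1],
)
print(best)  # → ('online programme', 6)
```

The point is not that a number decides for you, but that writing the weights down forces you to make your own priorities explicit instead of borrowing someone else's.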

8. Critical Thinking in Time Management

Time is the most valuable asset we have, so using it appropriately is crucial. Critical thinking in time management helps you plan your schedule wisely according to the importance of each task or activity. For example, if the task to which you devote most of your time is not giving you much return, you need to reconsider your schedule and devote more time to the tasks that give higher returns.

9. Critical Thinking in Analysing Fake News

Suppose a friend shares a piece of news with you. Do you bother to check whether it is real? Many of us simply believe the news and share it with others without considering that it could be fake. A study conducted by Stanford University found that around 82 per cent of teenagers failed to distinguish real news from an advertisement carrying a 'sponsored content' label. The problem arises partly because standard curricula do not emphasise critical thinking, on the assumption that it is built into every person. Lessons and activities designed to build general thinking skills can improve children's critical thinking. Nor is it only children: adults also fall for the fake news and articles that circulate on social media platforms. Before believing any piece of information, ask where it was published, what the article's intention is, who wrote it, and what agenda lies behind it. Critical thinking helps us evaluate information carefully before accepting it outright.

10. Critical Thinking in Distinguishing between Right and Wrong

Most people, especially teenagers, are very conscious of what their friends or relatives think of their behaviour. You may have been in a situation where, because your friends think certain behaviour is cool, you start acting that way to fit into the circle, without considering whether what you are doing is good or bad, or whether your actions match your beliefs. Behaviour that seems cool to some people may seem bad to others. Rather than changing your actions to win certain people's approval, look at the broader picture and analyse deeply whether those actions are morally right or wrong.

11. Critical Thinking in Decoding Fashion Trends

Nowadays, some people are so keen on the latest fashion trends that they follow everything a popular actor, actress, or fashion influencer suggests. A critical thinker asks questions such as: why does a trend that was so popular a few years ago seem foolish now? Why is a trend that does not even look good so popular? Does a trend that suits someone else actually suit me? Critical thinking keeps people from falling victim to the bandwagon fallacy, the fallacy of judging something good or bad simply because the majority of people think so. Fashion trends are a common example of it.

12. Critical Thinking in Choosing a Suitable Diet and Exercise

You have probably heard of diets such as the keto diet, the Whole30 diet, gluten-free diets, and vegan diets. Choosing the one that is best for you seems complex. What people usually do is search online, watch several videos, and pick the diet that showed the best results for the person in the video. That is not the right approach; choosing a diet for yourself requires critical thinking. Critical thinkers weigh the pros and cons of a diet for their own body, and they ask a professional dietician for advice rather than following a random person online. Choosing suitable exercise likewise demands critical thinking: What are your goals? How can you achieve them? At what time can you exercise? Do you have any injuries that a particular exercise might aggravate? People who think critically ask all these questions, then draw on what they know, follow a routine for a few weeks, analyse the results they are getting, and finally settle on a proper schedule.

13. Critical Thinking in Online Shopping

In today's digital era, most people prefer online shopping. However, e-commerce websites use tactics and psychological tricks, such as the anchoring effect, the Stroop effect, and the serial position effect, that push customers to buy more things, or things they do not even need. Critical thinking helps people buy smartly without falling victim to these tactics. When making a purchase, focus on the price you are actually paying for the item rather than the discount you are getting, because the price you pay may not be worth it even after the discount.
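One way to act on that advice is to ignore the advertised discount entirely and compare the price actually paid against what the item is worth to you. A toy Python sketch (all prices are invented):

```python
def worth_buying(list_price, discount_pct, max_worth_to_me):
    """Judge a deal by the price actually paid, not by the discount size."""
    paid = list_price * (1 - discount_pct / 100)
    return paid <= max_worth_to_me, round(paid, 2)

# A "50% off" deal anchored to an inflated list price can still cost more
# than the item is worth to you, while a small discount can be a good deal:
print(worth_buying(160, 50, 60))  # → (False, 80.0): big discount, bad deal
print(worth_buying(70, 10, 65))   # → (True, 63.0): small discount, good deal
```

The anchoring effect works precisely because shoppers compare the sale price to the list price instead of to their own valuation; making that valuation explicit removes the anchor.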

14. Critical Thinking in Job Search

Critical thinking plays an important role in a job search. If you are applying for a job, consider the following points.

Use of Keywords in Your Résumé: Always understand the job post and its requirements before applying. Update your résumé for the role and include some of the keywords mentioned in the job requirements. If you possess critical thinking skills such as problem-solving, analysis, communication, or creativity, list them; however, refrain from adding skills you do not actually possess.

Cover Letter: Hiring managers receive hundreds of résumés daily, so the chance that they will read every one is small. You can make your application stand out with a good cover letter. When you list your critical thinking skills, briefly describe the tasks or projects in previous jobs where you demonstrated them rather than simply naming each skill. This assures the recruiter that you are not listing skills at random and that you genuinely possess these qualities.

Interviews: Some interviewers now present candidates with hypothetical scenarios to test their critical thinking. You may be asked what you think of a given situation, what your first reaction to a given image is, or to solve a problem and then walk the recruiter through your reasoning. The interviewer is more interested in how you reach the conclusion than in the conclusion itself; your thought process shows how you approach problems.

15. Critical Thinking While Driving

Imagine you are driving on a busy road and your phone starts ringing with an urgent call you have to take. What would you do: pick up the call and risk an accident, or pull over to the roadside to take it? Critical thinking helps you make sound decisions while driving, whether that is finding the right place to park, judging whether your car can fit through a narrow street, or reacting when an animal suddenly runs in front of the car. Critical thinking is therefore a must-have skill for driving.

16. Critical Thinking in Business

Critical thinking is one of the most important qualities a business owner needs. Running a business means making important decisions, communicating effectively with clients, hiring suitable employees, taking calculated risks, and dealing with constant ups and downs; all of these require critical thinking.




33 Critical Analysis Examples


Critical analysis refers to the ability to examine something in detail in preparation to make an evaluation or judgment.

It will involve exploring underlying assumptions, theories, arguments, evidence, logic, biases, contextual factors, and so forth, that could help shed more light on the topic.

In essay writing, a critical analysis essay will involve using a range of analytical skills to explore a topic, such as:

  • Evaluating sources
  • Exploring strengths and weaknesses
  • Exploring pros and cons
  • Questioning and challenging ideas
  • Comparing and contrasting ideas



Critical Analysis Examples

1. Exploring Strengths and Weaknesses

Perhaps the first and most straightforward method of critical analysis is to create a simple strengths-vs-weaknesses comparison.

Most things have both strengths and weaknesses – you could even do this for yourself! What are your strengths? Maybe you’re kind or good at sports or good with children. What are your weaknesses? Maybe you struggle with essay writing or concentration.

If you can analyze your own strengths and weaknesses, then you understand the concept. What might be the strengths and weaknesses of the idea you’re hoping to critically analyze?

Strengths and weaknesses could include:

  • Does it seem highly ethical (strength) or could it be more ethical (weakness)?
  • Is it clearly explained (strength) or complex and lacking logical structure (weakness)?
  • Does it seem balanced (strength) or biased (weakness)?

You may consider using a SWOT analysis for this step. I’ve provided a SWOT analysis guide here.

2. Evaluating Sources

Evaluation of sources refers to looking at whether a source is reliable or unreliable.

This is a fundamental media literacy skill.

Steps involved in evaluating sources include asking questions like:

  • Who is the author and are they trustworthy?
  • Is this written by an expert?
  • Is this sufficiently reviewed by an expert?
  • Is this published in a trustworthy publication?
  • Are the arguments sound or common sense?

For more on this topic, I’d recommend my detailed guide on digital literacy.

3. Identifying Similarities

Identifying similarities encompasses the act of drawing parallels between elements, concepts, or issues.

In critical analysis, it’s common to compare a given article, idea, or theory to another one. In this way, you can identify areas in which they are alike.

Determining similarities can be a challenge, but it’s an intellectual exercise that fosters a greater understanding of the aspects you’re studying. This step often calls for a careful reading and note-taking to highlight matching information, points of view, arguments or even suggested solutions.

Similarities might be found in:

  • The key themes or topics discussed
  • The theories or principles used
  • The demographic the work is written for or about
  • The solutions or recommendations proposed

Remember, the intention of identifying similarities is not to prove one right or wrong. Rather, it sets the foundation for understanding the larger context of your analysis, anchoring your arguments in a broader spectrum of ideas.

Your critical analysis strengthens when you can see the patterns and connections across different works or topics. It fosters a more comprehensive, insightful perspective. And importantly, it is a stepping stone in your analysis journey towards evaluating differences, which is equally imperative and insightful in any analysis.

4. Identifying Differences

Identifying differences involves pinpointing the unique aspects, viewpoints or solutions introduced by the text you’re analyzing. How does it stand out as different from other texts?

To do this, you’ll need to compare this text to another text.

Differences can be revealed in:

  • The potential applications of each idea
  • The time, context, or place in which the elements were conceived or implemented
  • The available evidence each element uses to support its ideas
  • The perspectives of authors
  • The conclusions reached

Identifying differences helps to reveal the multiplicity of perspectives and approaches on a given topic. Doing so provides a more in-depth, nuanced understanding of the field or issue you’re exploring.

This deeper understanding can greatly enhance your overall critique of the text you’re looking at. As such, learning to identify both similarities and differences is an essential skill for effective critical analysis.

My favorite tool for identifying similarities and differences is a Venn Diagram:


To use a Venn diagram, title each circle with one of the two texts. Then place similarities in the overlapping area of the circles, and the unique characteristics (differences) of each text in the non-overlapping parts.
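The same sorting can be done programmatically with set operations, which mirror the regions of a Venn diagram. A small Python sketch, using invented theme lists for two hypothetical texts:

```python
# Themes identified while reading two hypothetical texts.
text_a = {"motivation", "feedback", "teamwork", "autonomy"}
text_b = {"feedback", "teamwork", "compensation", "leadership"}

overlap = text_a & text_b  # middle of the Venn diagram: shared themes
only_a = text_a - text_b   # left circle only: unique to text A
only_b = text_b - text_a   # right circle only: unique to text B

print(sorted(overlap))  # → ['feedback', 'teamwork']
print(sorted(only_a))   # → ['autonomy', 'motivation']
print(sorted(only_b))   # → ['compensation', 'leadership']
```

Intersection gives you the shared ground to anchor your comparison; the two set differences give you the contrasts to evaluate in the next step.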

6. Identifying Oversights

Identifying oversights entails pointing out what the author missed, overlooked, or neglected in their work.

Almost every written work, no matter the expertise or meticulousness of the author, contains oversights. These omissions can be absent-minded mistakes or gaps in the argument, stemming from a lack of knowledge, foresight, or attentiveness.

Such gaps can be found in:

  • Missed opportunities to counter or address opposing views
  • Failure to consider certain relevant aspects or perspectives
  • Incomplete or insufficient data that leaves the argument weak
  • Failing to address potential criticism or counter-arguments

By shining a light on these weaknesses, you increase the depth and breadth of your critical analysis. It helps you to estimate the full worth of the text, understand its limitations, and contextualize it within the broader landscape of related work. Ultimately, noticing these oversights helps to make your analysis more balanced and considerate of the full complexity of the topic at hand.

You may notice here that identifying oversights requires you to already have a broad understanding and knowledge of the topic in the first place – so, study up!

7. Fact Checking

Fact-checking refers to the process of meticulously verifying the truth and accuracy of the data, statements, or claims put forward in a text.

Fact-checking serves as the bulwark against misinformation, bias, and unsubstantiated claims. It demands thorough research, resourcefulness, and a keen eye for detail.

Fact-checking goes beyond surface-level assertions:

  • Examining the validity of the data given
  • Cross-referencing information with other reliable sources
  • Scrutinizing references, citations, and sources utilized in the article
  • Distinguishing between opinion and objectively verifiable truths
  • Checking for outdated, biased, or unbalanced information

If you identify factual errors, it’s vital to highlight them when critically analyzing the text. But remember, you could also (after careful scrutiny) highlight that the text appears to be factually correct – that, too, is critical analysis.

8. Exploring Counterexamples

Exploring counterexamples involves searching and presenting instances or cases which contradict the arguments or conclusions presented in a text.

Counterexamples are an effective way to challenge the generalizations, assumptions or conclusions made in an article or theory. They can reveal weaknesses or oversights in the logic or validity of the author’s perspective.

Considerations in counterexample analysis are:

  • Identifying generalizations made in the text
  • Seeking examples in academic literature or real-world instances that contradict these generalizations
  • Assessing the impact of these counterexamples on the validity of the text’s argument or conclusion

Exploring counterexamples enriches your critical analysis by injecting an extra layer of scrutiny, and even doubt, in the text.

By presenting counterexamples, you not only test the resilience and validity of the text but also open up new avenues of discussion and investigation that can further your understanding of the topic.

See Also: Counterargument Examples

9. Assessing Methodologies

Assessing methodologies entails examining the techniques, tools, or procedures employed by the author to collect, analyze and present their information.

The accuracy and validity of a text’s conclusions often depend on the credibility and appropriateness of the methodologies used.

Aspects to inspect include:

  • The appropriateness of the research method for the research question
  • The adequacy of the sample size
  • The validity and reliability of data collection instruments
  • The application of statistical tests and evaluations
  • The implementation of controls to prevent bias or mitigate its impact

One strategy you could implement here is to consider a range of other methodologies the author could have used. If the author conducted interviews, consider questioning why they didn’t use broad surveys that could have presented more quantitative findings. If they only interviewed people with one perspective, consider questioning why they didn’t interview a wider variety of people, etc.

See Also: A List of Research Methodologies

10. Exploring Alternative Explanations

Exploring alternative explanations refers to the practice of proposing differing or opposing ideas to those put forward in the text.

An underlying assumption in any analysis is that there may be multiple valid perspectives on a single topic. The text you’re analyzing might provide one perspective, but your job is to bring into the light other reasonable explanations or interpretations.

Cultivating alternative explanations often involves:

  • Formulating hypotheses or theories that differ from those presented in the text
  • Referring to other established ideas or models that offer a differing viewpoint
  • Suggesting a new or unique angle to interpret the data or phenomenon discussed in the text

Searching for alternative explanations challenges the authority of a singular narrative or perspective, fostering an environment ripe for intellectual discourse and critical thinking. It nudges you to examine the topic from multiple angles, enhancing your understanding and appreciation of the complexity inherent in the field.

A Full List of Critical Analysis Skills

  • Exploring Strengths and Weaknesses
  • Evaluating Sources
  • Identifying Similarities
  • Identifying Differences
  • Identifying Biases
  • Hypothesis Testing
  • Fact-Checking
  • Exploring Counterexamples
  • Assessing Methodologies
  • Exploring Alternative Explanations
  • Pointing Out Contradictions
  • Challenging the Significance
  • Cause-And-Effect Analysis
  • Assessing Generalizability
  • Highlighting Inconsistencies
  • Reductio ad Absurdum
  • Comparing to Expert Testimony
  • Comparing to Precedent
  • Reframing the Argument
  • Pointing Out Fallacies
  • Questioning the Ethics
  • Clarifying Definitions
  • Challenging Assumptions
  • Exposing Oversimplifications
  • Highlighting Missing Information
  • Demonstrating Irrelevance
  • Assessing Effectiveness
  • Assessing Trustworthiness
  • Recognizing Patterns
  • Differentiating Facts from Opinions
  • Analyzing Perspectives
  • Prioritization
  • Making Predictions
  • Conducting a SWOT Analysis
  • PESTLE Analysis
  • Asking the Five Whys
  • Correlating Data Points
  • Finding Anomalies Or Outliers
  • Comparing to Expert Literature
  • Drawing Inferences
  • Assessing Validity & Reliability

Analysis and Bloom’s Taxonomy

Benjamin Bloom placed analysis as the third-highest form of thinking on his ladder of cognitive skills called Bloom’s Taxonomy .

This taxonomy starts with the lowest levels of thinking – remembering and understanding. The further we go up the ladder, the more we reach higher-order thinking skills that demonstrate depth of understanding and knowledge.



Chris Drew (PhD)

Dr. Chris Drew is the founder of the Helpful Professor. He holds a PhD in education and has published over 20 articles in scholarly journals. He is the former editor of the Journal of Learning Development in Higher Education.




Five Examples of Critical Thinking Skills


What examples can we give to clarify the meaning of critical thinking? This article defines critical thinking and provides five examples of critical thinking skills in various fields and disciplines.


In teaching, we usually use the term critical thinking to push students to think on a higher level, as described in the cognitive-domain categories of Bloom's Taxonomy. We call these Higher Order Thinking Skills, or HOTS.

But how is critical thinking demonstrated? What should we look for in students' work that indicates they have thought critically?

Here I clarify this sometimes vague concept, one that is often mentioned but rarely applied specifically in the delivery of lessons or courses. As teachers and mentors, we need this concept to be crystal clear in our minds so that we can assess demonstrations of critical thinking and incorporate them into our rubrics.

Let's proceed by first defining critical thinking; I will then give five critical thinking examples in different disciplines.

Definition of Critical Thinking and Its Importance

Critical thinking is a crucial skill that plays a significant role in education. It involves the ability to analyze, evaluate, and interpret information logically.

Critical thinking is the ability to analyze, evaluate, and interpret information logically.

By encouraging critical thinking, educators aim to develop students’ problem-solving abilities, enhance their decision-making skills, and foster independent and creative thinking.

In today’s rapidly changing world, where information is readily available and constantly evolving, critical thinking has become even more essential. It enables individuals to navigate through the vast amount of information, distinguish between reliable and unreliable sources, and make informed judgments.

Critical thinking helps students develop a deeper understanding of the subjects they study, as they learn to question assumptions, challenge existing knowledge, and explore alternative perspectives.

By incorporating critical thinking into education, students are better equipped to face real-world challenges. They become more adaptable, open-minded, and capable of making well-reasoned decisions.

Critical thinking also promotes effective communication and collaboration, as students learn to articulate their thoughts, listen to others’ viewpoints, and engage in constructive discussions.

In the following sections, we will explore five examples of critical thinking across disciplines, together with the specific teaching strategies that develop this skill.

Critical Thinking Examples Across 5 Disciplines

In this section, we will explore five critical thinking examples across different disciplines, including environmental science, statistics, engineering, science, and humanities. Each example highlights how specific teaching strategies can improve critical thinking skills.

1. Environmental Science

One example of critical thinking in environmental science is analyzing the impact of human activities on ecosystems. When students are taught to evaluate the consequences of actions such as deforestation or pollution, they develop a deeper understanding of the interconnectedness of the environment.

Engaging students in hands-on experiments about pollution, fieldwork, and case studies can enhance their critical thinking skills by encouraging them to question assumptions, consider alternative solutions, and evaluate the long-term effects of human actions.

For instance, in a classroom setting, we can present students with a case study on the effects of deforestation on a specific ecosystem. We can then ask them to analyze the data, identify the underlying causes, and propose sustainable solutions.

By doing so, we encourage students to think critically about the complex relationship between human activities and the environment, considering both short-term and long-term consequences.

2. Statistics

Critical thinking in statistics involves interpreting and analyzing data to make informed decisions. Teaching students to question the validity of data sources, identify biases, and analyze statistical methods can improve their critical thinking skills.

Incorporating real-world examples, interactive data analysis exercises, and group discussions can enhance students’ ability to evaluate the reliability of statistical information and draw accurate conclusions.

For example, we can give students a dataset and ask them to critically evaluate the methodology used to collect the data, identify any potential biases, and draw meaningful conclusions.

By engaging in group discussions, students can compare their findings, challenge each other’s assumptions, and develop a deeper understanding of the limitations and strengths of statistical analysis.
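One way to make this exercise concrete is a short classroom script. The sketch below uses entirely hypothetical numbers (the census proportions and survey counts are invented for illustration) to show how students might compare the composition of a survey sample against known population figures to spot potential sampling bias:

```python
# Hypothetical exercise: does the survey sample mirror the population?
population = {"urban": 0.55, "rural": 0.45}   # assumed census proportions
sample = {"urban": 180, "rural": 60}          # hypothetical survey counts

total = sum(sample.values())
for group, count in sample.items():
    observed = count / total                  # share of this group in the sample
    expected = population[group]              # share in the population
    gap = observed - expected
    flag = "possible sampling bias" if abs(gap) > 0.10 else "looks representative"
    print(f"{group}: observed {observed:.2f}, expected {expected:.2f} -> {flag}")
```

Here urban respondents make up 75% of the sample against an assumed 55% of the population, so the script flags possible bias in both groups. Students can then debate whether the 10-percentage-point threshold is reasonable, which itself is an exercise in critical thinking.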

3. Engineering

Critical thinking in engineering involves problem-solving and innovation. By presenting students with complex engineering challenges, educators can foster critical thinking skills.

Encouraging students to brainstorm, analyze constraints, and propose creative solutions can enhance their ability to think critically. Incorporating project-based learning, teamwork, and hands-on experiments can further develop their critical thinking skills in the engineering field.

For instance, we can task students with designing and building a prototype to solve a specific engineering problem. Throughout the process, they are required to think critically about the constraints, consider alternative approaches, and evaluate the feasibility of their solutions.

By working collaboratively in teams, students can also learn from each other’s perspectives and develop a more comprehensive understanding of the problem at hand.

4. Science

Critical thinking in science involves questioning existing theories, designing experiments, and analyzing results. By teaching students to challenge assumptions, evaluate evidence, and draw logical conclusions, educators can enhance their critical thinking skills.

Engaging students in scientific inquiry, encouraging them to develop hypotheses, and providing opportunities for peer review and scientific debate can further improve their ability to think critically.

For example, we can give students a scientific research paper and have them critically evaluate its methodology, analyze the results, and draw conclusions based on the evidence presented.

By engaging in peer review and scientific debate, students can refine their critical thinking skills by challenging each other’s interpretations, identifying potential flaws in the research, and proposing alternative explanations.

5. Humanities

Critical thinking in humanities involves analyzing and interpreting texts, artworks, and historical events. By teaching students to question biases, analyze multiple perspectives, and evaluate evidence, educators can enhance their critical thinking skills. Incorporating class discussions, debates, and critical analysis of primary and secondary sources can further develop students’ ability to think critically in the humanities.

For instance, we can assign students a historical event and ask them to critically analyze primary and secondary sources in order to gain a deeper understanding of the event from multiple perspectives.

By engaging in class discussions and debates, students can develop their critical thinking skills by challenging prevailing narratives, questioning biases, and evaluating the reliability of different sources.

By exploring these five examples, we can see that specific teaching strategies in various disciplines can improve critical thinking skills. These examples show the importance of incorporating critical thinking into education to equip students with the skills necessary to navigate complex challenges and make informed decisions.

Conclusions and Recommendations

Based on the discussion in the previous section, critical thinking skills are essential across various disciplines. To develop these skills effectively, educators should employ the teaching strategies illustrated in the five examples above, such as hands-on experiments, real-world examples, project-based learning, and critical analysis of sources. With these strategies in place, students can navigate complex challenges, make informed decisions, and become critical thinkers in their respective fields.


About the Author: Patrick Regoniel

Dr. Regoniel, a faculty member of the graduate school, has served as a consultant to various environmental research and development projects covering climate change, coral reef resources and management, economic valuation of environmental and natural resources, mining, and waste management and pollution. He has extensive experience in applied statistics and systems modelling and analysis, is an avid practitioner of LaTeX, and is a multidisciplinary web developer. He leverages pioneering AI-powered content creation tools to produce unique and comprehensive articles on this website.
