Writing Student Learning Outcomes

Student learning outcomes state what students are expected to know or be able to do upon completion of a course or program. Course learning outcomes may contribute to, or map to, program learning outcomes, and are required in group instruction course syllabi.

At both the course and program level, student learning outcomes should be clear, observable and measurable, and reflect what will be included in the course or program requirements (assignments, exams, projects, etc.). Typically there are 3-7 course learning outcomes and 3-7 program learning outcomes.

When submitting learning outcomes for course or program approvals, or assessment planning and reporting, please:

  • Begin with a verb (exclude any introductory text and the phrase “Students will…”, as this is assumed)
  • Limit the length of each learning outcome to 400 characters
  • Exclude special characters (e.g., accents, umlauts, ampersands, etc.)
  • Exclude special formatting (e.g., bullets, dashes, numbering, etc.)
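The formatting rules above lend themselves to a quick automated screen before submission. The following is a minimal sketch, assuming one wanted to check outcome statements programmatically; the function name and the exact set of checks are illustrative, not part of any campus system:

```python
import re

MAX_LENGTH = 400  # character limit stated in the submission guidelines

# Verbs the guidance flags as too vague to be measurable
VAGUE_VERBS = {"know", "understand", "learn", "appreciate"}

def check_outcome(outcome: str) -> list[str]:
    """Return a list of problems found in a proposed learning outcome."""
    problems = []
    if len(outcome) > MAX_LENGTH:
        problems.append(f"exceeds {MAX_LENGTH} characters")
    # Outcomes should begin directly with the verb; "Students will..." is assumed
    if outcome.lower().startswith("students will"):
        problems.append('drop the introductory "Students will..." phrase')
    words = outcome.split()
    if words and words[0].lower() in VAGUE_VERBS:
        problems.append(f'"{words[0]}" is not specific enough to be measurable')
    # Special characters (accents, umlauts, etc.) are excluded
    if not outcome.isascii():
        problems.append("contains special characters (accents, umlauts, etc.)")
    # Bullets, dashes, and numbering are excluded
    if re.match(r"^(\d+[.)]|[-*])\s", outcome):
        problems.append("remove bullet or numbering formatting")
    return problems

print(check_outcome("Understand the scientific method"))
# → ['"Understand" is not specific enough to be measurable']
```

An outcome that begins with a specific action verb and stays within the limits, such as "Analyze current research findings in social psychology", passes with no problems reported.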


Steps for Writing Outcomes

The following are recommended steps for writing clear, observable and measurable student learning outcomes. In general, use student-focused language, begin with action verbs and ensure that the learning outcomes demonstrate actionable attributes.

1. Begin with an Action Verb

Begin with an action verb that denotes the level of learning expected. Terms such as know, understand, learn, and appreciate are generally not specific enough to be measurable. Levels of learning and associated verbs may include the following:

  • Remembering and understanding: recall, identify, label, illustrate, summarize.
  • Applying and analyzing: use, differentiate, organize, integrate, apply, solve, analyze.
  • Evaluating and creating: monitor, test, judge, produce, revise, compose.

Consult Bloom’s Revised Taxonomy (below) for more details. For additional sample action verbs, consult this list from The Centre for Learning, Innovation & Simulation at The Michener Institute of Education at UHN.

2. Follow with a Statement

Follow the action verb with a statement describing the specific knowledge or skill students will demonstrate. For example:

  • Identify and summarize the important features of major periods in the history of western culture
  • Apply important chemical concepts and principles to draw conclusions about chemical reactions
  • Demonstrate knowledge about the significance of current research in the field of psychology by writing a research paper

Keep each learning outcome to no more than 400 characters.

*Note: Any special characters (e.g., accents, umlauts, ampersands, etc.) and formatting (e.g., bullets, dashes, numbering, etc.) will need to be removed when submitting learning outcomes through HelioCampus Assessment and Credentialing (formerly AEFIS) and other digital campus systems.

Revised Bloom’s Taxonomy of Learning: The “Cognitive” Domain

Graphic depiction of Revised Bloom's Taxonomy, with a sampling of verbs that represent learning at each level. Find additional action verbs.

*Text adapted from: Bloom, B.S. (Ed.). (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook 1, Cognitive domain. New York: David McKay.

Anderson, L.W. (Ed.), Krathwohl, D.R. (Ed.), Airasian, P.W., Cruikshank, K.A., Mayer, R.E., Pintrich, P.R., Raths, J., & Wittrock, M.C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s Taxonomy of Educational Objectives (Complete edition). New York: Longman.

Examples of Learning Outcomes

Academic Program Learning Outcomes

The following examples of academic program student learning outcomes come from a variety of academic programs across campus, and are organized in four broad areas: 1) contextualization of knowledge; 2) praxis and technique; 3) critical thinking; and, 4) research and communication.

Student learning outcomes for each UW-Madison undergraduate and graduate academic program can be found in Guide. Click on the program of your choosing to find its designated learning outcomes.


Contextualization of Knowledge

Students will…

  • identify, formulate and solve problems using appropriate information and approaches.
  • demonstrate their understanding of major theories, approaches, concepts, and current and classical research findings in the area of concentration.
  • apply knowledge of mathematics, chemistry, physics, and materials science and engineering principles to materials and materials systems.
  • demonstrate an understanding of the basic biology of microorganisms.

Praxis and Technique

  • utilize the techniques, skills and modern tools necessary for practice.
  • demonstrate professional and ethical responsibility.
  • appropriately apply laws, codes, regulations, architectural and interiors standards that protect the health and safety of the public.

Critical Thinking

  • recognize, describe, predict, and analyze systems behavior.
  • evaluate evidence to determine and implement best practice.
  • examine technical literature, resolve ambiguity and develop conclusions.
  • synthesize knowledge and use insight and creativity to better understand and improve systems.

Research and Communication

  • retrieve, analyze, and interpret the professional and lay literature providing information to both professionals and the public.
  • propose original research: outlining a plan, assembling the necessary protocol, and performing the original research.
  • design and conduct experiments, and analyze and interpret data.
  • write clear and concise technical reports and research articles.
  • communicate effectively through written reports, oral presentations and discussion.
  • guide, mentor and support peers to achieve excellence in practice of the discipline.
  • work in multi-disciplinary teams and provide leadership on materials-related problems that arise in multi-disciplinary work.

Course Learning Outcomes

  • identify, formulate and solve integrative chemistry problems. (Chemistry)
  • build probability models to quantify risks of an insurance system, and use data and technology to make appropriate statistical inferences. (Actuarial Science)
  • use basic vector, raster, 3D design, video and web technologies in the creation of works of art. (Art)
  • apply differential calculus to model rates of change in time of physical and biological phenomena. (Math)
  • identify characteristics of certain structures of the body and explain how structure governs function. (Human Anatomy lab)
  • calculate the magnitude and direction of magnetic fields created by moving electric charges. (Physics)

Additional Resources

  • Bloom’s Taxonomy
  • The Six Facets of Understanding – Wiggins, G. & McTighe, J. (2005). Understanding by Design (2nd ed.). ASCD
  • Taxonomy of Significant Learning – Fink, L.D. (2003). A Self-Directed Guide to Designing Courses for Significant Learning. Jossey-Bass
  • College of Agricultural & Life Sciences Undergraduate Learning Outcomes
  • College of Letters & Science Undergraduate Learning Outcomes

Indiana University Indianapolis, Center for Teaching and Learning

Writing and Assessing Student Learning Outcomes

By the end of a program of study, what do you want students to be able to do? How can your students demonstrate the knowledge the program intended them to learn? Student learning outcomes are statements developed by faculty that answer these questions. Typically, student learning outcomes (SLOs) describe the knowledge, skills, attitudes, behaviors, or values students should be able to demonstrate at the end of a program of study. A combination of methods may be used to assess student attainment of learning outcomes.

Characteristics of Student Learning Outcomes (SLOs)

  • Describe what students should be able to demonstrate, represent or produce upon completion of a program of study (Maki, 2010)

Diagram pairing sample learning outcomes with the actions that demonstrate that learning.

Student learning outcomes also:

  • Should align with the institution’s curriculum and co-curriculum outcomes (Maki, 2010)
  • Should be collaboratively authored and collectively accepted (Maki, 2010)
  • Should incorporate or adapt professional organizations’ outcome statements when they exist (Maki, 2010)
  • Can be quantitatively and/or qualitatively assessed during a student’s studies (Maki, 2010)

Examples of Student Learning Outcomes

The following examples of student learning outcomes are too general and would be very hard to measure (T. Banta, personal communication, October 20, 2010):

  • will appreciate the benefits of exercise science.
  • will understand the scientific method.
  • will become familiar with correct grammar and literary devices.
  • will develop problem-solving and conflict resolution skills.

The following examples, while better, are still general and would be hard to measure (T. Banta, personal communication, October 20, 2010):

  • will appreciate exercise as a stress reduction tool.
  • will apply the scientific method in problem solving.
  • will demonstrate the use of correct grammar and various literary devices.
  • will demonstrate critical thinking skills, such as problem solving as it relates to social issues.

The following examples are specific and would be fairly easy to measure with the correct assessment measure (T. Banta, personal communication, October 20, 2010):

  • will explain how the science of exercise affects stress.
  • will design a grounded research study using the scientific method.
  • will demonstrate the use of correct grammar and various literary devices in creating an essay.
  • will analyze and respond to arguments about racial discrimination.

Importance of Action Verbs and Examples from Bloom’s Taxonomy

  • Action verbs result in overt behavior that can be observed and measured (see list below).
  • Verbs that are unclear, and verbs that relate to unobservable or unmeasurable behaviors, should be avoided (e.g., appreciate, understand, know, learn, become aware of, become familiar with). View Bloom’s Taxonomy Action Verbs

Assessing SLOs

Instructors may measure student learning outcomes directly, by assessing student-produced artifacts and performances; they may also measure student learning indirectly, by relying on students’ own perceptions of their learning.

Direct Measures of Assessment

Direct measures of student learning require students to demonstrate their knowledge and skills. They provide tangible, visible and self-explanatory evidence of what students have and have not learned as a result of a course, program, or activity (Suskie, 2004; Palomba & Banta, 1999). Examples of direct measures include:

  • Objective tests
  • Presentations
  • Classroom assignments

This example of a Student Learning Outcome (SLO) from psychology could be assessed by an essay, case study, or presentation: Students will analyze current research findings in the areas of physiological psychology, perception, learning, abnormal and social psychology.

Indirect Measures of Assessment

Indirect measures of student learning capture students’ perceptions of their knowledge and skills; they supplement direct measures of learning by providing information about how and why learning is occurring. Examples of indirect measures include:

  • Self assessment
  • Peer feedback
  • End of course evaluations
  • Questionnaires
  • Focus groups
  • Exit interviews

Using the SLO example from above, an instructor could add questions to an end-of-course evaluation asking students to self-assess their ability to analyze current research findings in the areas of physiological psychology, perception, learning, abnormal and social psychology. Doing so would provide an indirect measure of the same SLO.

Using a combination of direct and indirect measures:

  • Balances the limitations inherent when using only one method (Maki, 2004).
  • Provides students the opportunity to demonstrate learning in an alternative way (Maki, 2004).
  • Contributes to an overall interpretation of student learning at both institutional and programmatic levels.
  • Values the many ways students learn (Maki, 2004).

Bloom, B. (1956). A taxonomy of educational objectives, The classification of educational goals, handbook I: Cognitive domain. New York: McKay.

Maki, P.L. (2004). Assessing for learning: Building a sustainable commitment across the institution . Sterling, VA: Stylus.

Maki, P.L. (2010). Assessing for learning: Building a sustainable commitment across the institution (2nd ed.). Sterling, VA: Stylus.

Palomba, C.A., & Banta, T.W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco: Jossey-Bass.

Suskie, L. (2004). Assessing student learning: A common sense guide. Bolton, MA: Anker Publishing.

Revised by Doug Jerolimov (April, 2016)

Helpful Links

  • Revised Bloom's Taxonomy Action Verbs
  • Fink's Taxonomy

Related Guides

  • Creating a Syllabus
  • Assessing Student Learning Outcomes

Recommended Books

Assessing for Learning by Peggy L. Maki


Creating Learning Outcomes


A learning outcome is a concise description of what students will learn and how that learning will be assessed. Having clearly articulated learning outcomes can make designing a course, assessing student learning progress, and facilitating learning activities easier and more effective. Learning outcomes can also help students regulate their learning and develop effective study strategies.

Defining the terms

Educational research uses a number of terms for this concept, including learning goals, student learning objectives, session outcomes, and more. 

In alignment with other Stanford resources, we will use learning outcomes as a general term for what students will learn and how that learning will be assessed. This includes both goals and objectives. We will use learning goals to describe general outcomes for an entire course or program. We will use learning objectives when discussing more focused outcomes for specific lessons or activities.

For example, a learning goal might be “By the end of the course, students will be able to develop coherent literary arguments.” 

Whereas a learning objective might be, “By the end of Week 5, students will be able to write a coherent thesis statement supported by at least two pieces of evidence.”

Learning outcomes benefit instructors

Learning outcomes can help instructors in a number of ways by:

  • Providing a framework and rationale for making course design decisions about the sequence of topics and instruction, content selection, and so on.
  • Communicating to students what they must do to make progress in learning in your course.
  • Clarifying your intentions to the teaching team, course guests, and other colleagues.
  • Providing a framework for transparent and equitable assessment of student learning. 
  • Making outcomes concerning values and beliefs, such as dedication to discipline-specific values, more concrete and assessable.
  • Making inclusion and belonging explicit and integral to the course design.

Learning outcomes benefit students 

Clearly articulated learning outcomes can also help guide and support students in their own learning by:

  • Clearly communicating the range of learning students will be expected to acquire and demonstrate.
  • Helping learners concentrate on the areas that they need to develop to progress in the course.
  • Helping learners monitor their own progress, reflect on the efficacy of their study strategies, and seek out support or better strategies. (See Promoting Student Metacognition for more on this topic.)

Choosing learning outcomes

When writing learning outcomes to represent the aims and practices of a course or even a discipline, consider:

  • What is the big idea that you hope students will still retain from the course even years later?
  • What are the most important concepts, ideas, methods, theories, approaches, and perspectives of your field that students should learn?
  • What are the most important skills that students should develop and be able to apply in and after your course?
  • What would students need to have mastered earlier in the course or program in order to make progress later or in subsequent courses?
  • What skills and knowledge would students need if they were to pursue a career in this field or contribute to communities impacted by this field?
  • What values, attitudes, and habits of mind and affect would students need if they are to pursue a career in this field or contribute to communities impacted by this field?
  • How can the learning outcomes span a wide range of skills that serve students with differing levels of preparation?
  • How can learning outcomes offer a range of assessment types to serve a diverse student population?

Use learning taxonomies to inform learning outcomes

Learning taxonomies describe how a learner’s understanding develops from simple to complex when learning different subjects or tasks. They are useful here for identifying any foundational skills or knowledge needed for more complex learning, and for matching observable behaviors to different types of learning.

Bloom’s Taxonomy

Bloom’s Taxonomy is a hierarchical model and includes three domains of learning: cognitive, psychomotor, and affective. In this model, learning occurs hierarchically, as each skill builds on previous skills towards increasingly sophisticated learning. For example, in the cognitive domain, learning begins with remembering, then understanding, applying, analyzing, evaluating, and lastly creating. 

Taxonomy of Significant Learning

The Taxonomy of Significant Learning is a non-hierarchical and integral model of learning. It describes learning as a meaningful, holistic, and integral network. This model has six intersecting domains: knowledge, application, integration, human dimension, caring, and learning how to learn. 

See our resource on Learning Taxonomies and Verbs for a summary of these two learning taxonomies.

How to write learning outcomes

Writing learning outcomes can be made easier by using the ABCD approach. This strategy identifies four key elements of an effective learning outcome:

Consider the following example: Students (audience) will be able to label and describe (behavior), given a diagram of the eye at the end of this lesson (condition), all seven extraocular muscles, and at least two of their actions (degree).
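As a sketch of how the four ABCD elements compose a full outcome statement, one might model them as fields and assemble them in a fixed order. The class and method names below are illustrative, not part of the ABCD literature:

```python
from dataclasses import dataclass

@dataclass
class LearningOutcome:
    audience: str   # who will achieve the outcome
    behavior: str   # observable action, led by an action verb
    condition: str  # circumstances under which it is performed
    degree: str     # standard for acceptable performance

    def statement(self) -> str:
        """Assemble the four elements into a single outcome sentence."""
        return (f"{self.audience}, given {self.condition}, "
                f"will be able to {self.behavior} {self.degree}.")

# The eye-diagram example from the text, expressed as ABCD fields
eye_lesson = LearningOutcome(
    audience="Students",
    behavior="label and describe",
    condition="a diagram of the eye at the end of this lesson",
    degree="all seven extraocular muscles, and at least two of their actions",
)
print(eye_lesson.statement())
```

Separating the elements this way makes it easy to check that none of the four is missing before the sentence is finalized.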

Audience 

Define who will achieve the outcome. Outcomes commonly include phrases such as “After completing this course, students will be able to...” or “After completing this activity, workshop participants will be able to...”

Keeping your audience in mind as you develop your learning outcomes helps ensure that they are relevant and centered on what learners must achieve. Make sure the learning outcome is focused on the student’s behavior, not the instructor’s. If the outcome describes an instructional activity or topic, then it is too focused on the instructor’s intentions and not the students’.

Try to understand your audience so that you can better align your learning goals or objectives to meet their needs. While every group of students is different, certain generalizations about their prior knowledge, goals, motivation, and so on might be made based on course prerequisites, their year-level, or majors. 

Behavior

Use action verbs to describe observable behavior that demonstrates mastery of the goal or objective. Depending on the skill, knowledge, or domain of the behavior, you might select a different action verb. Particularly for learning objectives, which are more specific, avoid verbs that are vague or difficult to assess, such as “understand”, “appreciate”, or “know”.

The behavior usually completes the audience phrase “students will be able to…” with a specific action verb that learners can interpret without ambiguity. We recommend beginning learning goals with a phrase that makes it clear that students are expected to actively contribute to progressing towards a learning goal. For example, “through active engagement and completion of course activities, students will be able to…”

Example action verbs

Consider the following examples of verbs from different learning domains of Bloom’s Taxonomy. Generally speaking, items listed at the top under each domain are more suitable for advanced students, and items listed at the bottom are more suitable for novice or beginning students. Using verbs and associated skills from all three domains, regardless of your discipline area, can benefit students by diversifying the learning experience.

For the cognitive domain:

  • Create, investigate, design
  • Evaluate, argue, support
  • Analyze, compare, examine
  • Solve, operate, demonstrate
  • Describe, locate, translate
  • Remember, define, duplicate, list

For the psychomotor domain:

  • Invent, create, manage
  • Articulate, construct, solve
  • Complete, calibrate, control
  • Build, perform, execute
  • Copy, repeat, follow

For the affective domain:

  • Internalize, propose, conclude
  • Organize, systematize, integrate
  • Justify, share, persuade
  • Respond, contribute, cooperate
  • Capture, pursue, consume

Often we develop broad goals first, then break them down into specific objectives. For example, if a goal is for learners to be able to compose an essay, break it down into several objectives, such as forming a clear thesis statement, coherently ordering points, following a salient argument, gathering and quoting evidence effectively, and so on.

Condition

State the conditions, if any, under which the behavior is to be performed. Consider the following conditions:

  • Equipment or tools, such as using a laboratory device or a specified software application.
  • Situation or environment, such as in a clinical setting, or during a performance.
  • Materials or format, such as written text, a slide presentation, or using specified materials.

The level of specificity for conditions within an objective may vary and should be appropriate to the broader goals. If the conditions are implicit or understood as part of the classroom or assessment situation, it may not be necessary to state them. 

When articulating the conditions in learning outcomes, ensure that they are sensorily and financially accessible to all students.

Degree 

Degree states the standard or criterion for acceptable performance. The degree should be related to real-world expectations: what standard should the learner meet to be judged proficient? For example:

  • With 90% accuracy
  • Within 10 minutes
  • Suitable for submission to an edited journal
  • Obtain a valid solution
  • In a 100-word paragraph

The specificity of the degree will vary. You might take into consideration professional standards, what a student would need to succeed in subsequent courses in a series, or what is required by you as the instructor to accurately assess learning when determining the degree. Where the degree is easy to measure (such as pass or fail) or accuracy is not required, it may be omitted.

Characteristics of effective learning outcomes

The acronym SMART is useful for remembering the characteristics of an effective learning outcome.

  • Specific : clear and distinct from others.
  • Measurable : identifies observable student action.
  • Attainable : suitably challenging for students in the course.
  • Related : connected to other objectives and student interests.
  • Time-bound : likely to be achieved and keep students on task within the given time frame.

Examples of effective learning outcomes

These examples generally follow the ABCD and SMART guidelines. 

Arts and Humanities

Learning goals

Upon completion of this course, students will be able to apply critical terms and methodology in completing a written literary analysis of a selected literary work.

At the end of the course, students will be able to demonstrate oral competence with the French language in pronunciation, vocabulary, and language fluency in a 10-minute in-person interview with a member of the teaching team.

Learning objectives

After completing lessons 1 through 5, given images of specific works of art, students will be able to identify the artist, artistic period, and describe their historical, social, and philosophical contexts in a two-page written essay.

By the end of this course, students will be able to describe the steps in planning a research study, including identifying and formulating relevant theories, generating alternative solutions and strategies, and application to a hypothetical case in a written research proposal.

At the end of this lesson, given a diagram of the eye, students will be able to label all of the extraocular muscles and describe at least two of their actions.

Using chemical datasets gathered at the end of the first lab unit, students will be able to create plots and trend lines of that data in Excel and make quantitative predictions about future experiments.

  • How to Write Learning Goals, Evaluation and Research, Student Affairs (2021).
  • SMART Guidelines, Center for Teaching and Learning (2020).
  • Learning Taxonomies and Verbs, Center for Teaching and Learning (2021).


Writing to Think: Critical Thinking and the Writing Process

“Writing is thinking on paper.” (Zinsser, 1976, p. vii)

Google the term “critical thinking.” How many hits are there? On the day this tutorial was completed, Google found about 65,100,000 results in 0.56 seconds. That’s an impressive number, and it grows more impressively large every day. That’s because the nation’s educators, business leaders, and political representatives worry about the level of critical thinking skills among today’s students and workers.

What is Critical Thinking?

Simply put, critical thinking is sound thinking. Critical thinkers work to delve beneath the surface of sweeping generalizations, biases, clichés, and other quick observations that characterize ineffective thinking. They are willing to consider points of view different from their own, seek and study evidence and examples, root out sloppy and illogical argument, discern fact from opinion, embrace reason over emotion or preference, and change their minds when confronted with compelling reasons to do so. In sum, critical thinkers are flexible thinkers equipped to become active and effective spouses, parents, friends, consumers, employees, citizens, and leaders. Every area of life, in other words, can be positively affected by strong critical thinking.

Released in January 2011, an important study of college students over four years concluded that by graduation “large numbers [of American undergraduates] didn’t learn the critical thinking, complex reasoning and written communication skills that are widely assumed to be at the core of a college education” (Rimer, 2011, para. 1). The University designs curriculum, creates support programs, and hires faculty to help ensure you won’t be one of the students “[showing] no significant gains in . . . ‘higher order’ thinking skills” (Rimer, 2011, para. 4). One way the University works to help you build those skills is through writing projects.

Writing and Critical Thinking

Say the word “writing” and most people think of a completed publication. But say the word “writing” to writers, and they will likely think of the process of composing. Most writers would agree with novelist E. M. Forster, who wrote, “How can I know what I think until I see what I say?” (Forster, 1927, p. 99). Experienced writers know that the act of writing stimulates thinking.

Inexperienced and experienced writers have very different understandings of composition. Novice writers often make the mistake of believing they have to know what they’re going to write before they can begin writing. They often compose a thesis statement before asking questions or conducting research. In the course of their reading, they might even disregard material that counters their pre-formed ideas. This is not writing; it is recording.

In contrast, experienced writers begin with questions and work to discover many different answers before settling on those that are most convincing. They know that the act of putting words on paper or a computer screen helps them invent thought and content. Rather than trying to express what they already think, they express what the act of writing leads them to think as they put down words. More often than not, in other words, experienced writers write their way into ideas, which they then develop, revise, and refine as they go.

What has this notion of writing to do with critical thinking? Everything.

Consider the steps of the writing process: prewriting, outlining, drafting, revising, editing, seeking feedback, and publishing. These steps are not followed in a fixed or strict order; instead, the effective writer knows that as they write, it may be necessary to return to an earlier step. For example, in the process of revision, a writer may realize that the order of ideas is unclear. A new outline may help that writer re-order details. As they write, the writer considers and reconsiders the effectiveness of the work.

The writing process, then, is not just a mirror image of the thinking process: it is the thinking process. Confronted with a topic, an effective critical thinker/writer

  • asks questions
  • seeks answers
  • evaluates evidence
  • questions assumptions
  • tests hypotheses
  • makes inferences
  • employs logic
  • draws conclusions
  • predicts readers’ responses
  • creates order
  • drafts content
  • seeks others’ responses
  • weighs feedback
  • criticizes their own work
  • revises content and structure
  • seeks clarity and coherence

Example of Composition as Critical Thinking

“Good writing is fueled by unanswerable questions” (Lane, 1993, p. 15).

Imagine that you have been asked to write about a hero or heroine from history. You must explain what challenges that individual faced and how they conquered them. Now imagine that you decide to write about Rosa Parks and her role in the modern Civil Rights movement. Take a moment and survey what you already know. She refused to get up out of her seat on a bus so a White man could sit in it. She was arrested. As a result, Blacks in Montgomery protested, launching the Montgomery Bus Boycott. Martin Luther King, Jr. took up leadership of the cause, and ultimately a movement was born.

Is that really all there is to Rosa Parks’s story? What questions might a thoughtful writer ask? Here are a few:

  • Why did Rosa Parks refuse to get up on that particular day?
  • Was hers a spontaneous or planned act of defiance?
  • Did she work? Where? Doing what?
  • Had any other Black person refused to get up for a White person?
  • What happened to that individual or those individuals?
  • Why hadn’t that person or those persons received the publicity Parks did?
  • Was Parks active in Civil Rights before that day?
  • How did she learn about civil disobedience?

Even just these few questions could lead to potentially rich information.

Factual information would not be enough, however, to satisfy an assignment that asks for an interpretation of that information. The writer’s job for the assignment is to convince the reader that Parks was a heroine; in this way the writer must make an argument and support it. The writer must establish standards of heroic behavior. More questions arise:

  • What is heroic action?
  • What are the characteristics of someone who is heroic?
  • What do heroes value and believe?
  • What are the consequences of a hero’s actions?
  • Why do they matter?

Now the writer has even more research and more thinking to do.

By the time they have raised questions and answered them, raised more questions and answered them, and so on, they are ready to begin writing. But even then, new ideas will arise in the course of planning and drafting, inevitably leading the writer to more research and thought, to more composition and refinement.

Ultimately, every step of the way over the course of composing a project, the writer is engaged in critical thinking because the effective writer examines the work as they develop it.

Why Writing to Think Matters

Writing practice builds critical thinking, which empowers people to “take charge of [their] own minds” so they “can take charge of [their] own lives . . . and improve them, bringing them under [their] self command and direction” (Foundation for Critical Thinking, 2020, para. 12). Writing is a way of coming to know and understand the self and the changing world, enabling individuals to make decisions that benefit themselves, others, and society at large. Your knowledge alone – of law, medicine, business, or education, for example – will not be enough to meet future challenges. You will be tested by new unexpected circumstances, and when they arise, the open-mindedness, flexibility, reasoning, discipline, and discernment you have learned through writing practice will help you meet those challenges successfully.

References

Forster, E. M. (1927). Aspects of the novel. Harcourt, Brace & Company.

The Foundation for Critical Thinking. (2020, June 17). Our concept and definition of critical thinking. https://www.criticalthinking.org/pages/our-concept-of-critical-thinking/411

Lane, B. (1993). After the end: Teaching and learning creative revision. Heinemann.

Rimer, S. (2011, January 18). Study: Many college students not learning to think critically. The Hechinger Report. https://www.mcclatchydc.com/news/nation-world/national/article24608056.html

Zinsser, W. (1976). On writing well: The classic guide to writing nonfiction. HarperCollins.


Learning Through Writing - Strategies for Educators

An increased focus on writing is a catalyst for academic success.

How Does Writing Improve Student Outcomes?

In numerous educational settings, the scope of writing is often confined to worksheets, short responses, and the occasional essay assignment. This narrow approach overlooks the profound benefits of integrating writing across all areas of study. Fostering literacy development is a collective endeavor that spans all subjects, with writing acting as a pivotal tool for bolstering students' comprehension skills. There is a clear correlation between regular writing exercises and notable advancements in students' academic performance, underscoring the value of encouraging students to articulate their thoughts in writing.

Increase Critical Thinking through Writing addresses the vital role of writing in enhancing literacy as a comprehensive strategy for improving understanding across disciplines. This approach advocates integrating engaging writing tasks into every lesson, aiming to raise academic performance by prompting students to reflect on the subject matter and articulate their insights in writing.


Key Takeaways

Writing increases learning.

Learn how Scaffolding Strategies offer a dynamic approach to differentiating instruction, ensuring that each student's immediate learning needs are met. Explore how this tailored temporary support enhances understanding, engagement, and overall academic success.

Essential to Tier 1 Instruction

Recognize the dual nature of writing, which both lays a solid groundwork for assimilating and understanding new concepts and serves as an essential component of the learning process itself.

Learning Through Writing: 4 Key Writing Principles

Embrace writing as a foundational learning practice structured around four key principles: Integrating Writing Across the Curriculum, Expanding Writing Opportunities, Instructing the Writing Process, and Refining Writing with Constructive Feedback.

Varied Implementation

Commit to a widespread application of writing by focusing on two primary methodologies:

  • Writing to Learn: Reveal how writing serves as an instrument for deepening comprehension and assimilating new information
  • Writing to Inform: Demonstrate the use of writing in effectively conveying information, thereby ensuring clarity and understanding

Increase Reading Comprehension

Learn the symbiotic link between reading and writing, appreciating how writing fortifies literacy capabilities and significantly boosts reading skills.

With the right strategies and tools, your lessons will be revolutionized through the power of writing. Gain practical insights, effective strategies, and an adaptable roadmap for accelerating learning through the dynamic integration of writing across the curriculum.

Learning Through Writing - PD and Professional Learning Options

At Your School

Schedule a day of Professional Learning at your school. Onsite training with one of our experts provides opportunities for educators to expand their knowledge, refine teaching techniques, and stay up-to-date with the latest curriculum and instruction innovations. By investing in professional learning, your school can cultivate a culture of continuous learning, enhance teacher effectiveness, and ultimately improve student outcomes.

At Our Training Center

Join us in Asheville, North Carolina, at the Learning-Focused Training Center. With seats limited to 22 attendees, you are guaranteed to have a great experience.


At Your Own Pace

Learning at your own pace allows you to tailor your professional learning journey focused on your unique goals and preferences. This option empowers you to set your own pace with the flexibility and autonomy to master new concepts and refine your skills.

Want to learn more? Book a call at your preferred time.

Contact us today to explore how Increase Critical Thinking with Writing can help meet the needs of all students. Let's work together to improve learning through writing, enhance critical thinking, and empower every student to thrive academically!

Increase Critical Thinking Series

If you're exploring ways to transform your teaching methods and enrich the learning experience, don't stop at Increase Critical Thinking with Writing! Venture further into our Increase Critical Thinking Professional Learning Series, with each offering uniquely designed to address different aspects of teaching and learning through critical thinking strategies.


Office of Teaching & Learning

Writing Learning Outcomes

The promise of your course.

Many educators feel that a key to effective and efficient course design is to develop and understand the PROMISE of your course. Why should a student take this course? What will they “get out” of it? Will they gain a more critical or informed way of appreciating the world? A set of skills applicable to their future career? Mastery of a set of concepts that are foundational for more advanced learning?


Students appreciate knowing why they are being asked to learn something. According to Mary Clement’s recent article Three Steps to Better Course Evaluations, “I recommend making invisible expectations explicit. I regularly start class by saying, ‘We are learning this because …’ When students understand why and how the material is relevant to them, they find more motivation to study and end up rating the course more highly.”

If the promise of a course is not clear, it’s often more difficult to articulate learning outcomes that are based on student learning rather than content areas.

For many years, courses were designed based on content areas, by listing the topic areas to be “covered” and breaking that down by the number of class meetings. Today, learning is understood as a much more complex process, and much of the content that used to be available only in the college classroom is now widely available online for free. The classroom is no longer the place to dispense information, but rather the place to help students learn how to use, apply, and understand information.

Today, course design instead focuses on student learning goals: what college instructors want their students to learn, to know, and to be able to do by the end of the course. Instructors are asked to focus their course design efforts on what the students are doing in class as much as, if not more than, on what the instructor is doing.

Once the promise of a course is understood and articulated, it is easier to talk about the student learning goals, which are typically written out in the form of learning outcomes.

The task of writing learning outcomes often causes confusion and frustration among faculty members. It can be difficult to articulate in just a few statements all the complex learning that we want to occur in our courses. It is easy to get caught up in the distinctions between terms such as objectives, goals, measurable learning outcomes, or terminal course objectives. However, the bottom line is that it is useful for both instructors and students when the general desired outcomes of a course are stated and shared. Here are some tips and resources about writing learning outcomes.

  • Start with the end in mind. What are the main goals of your course for students? What is it that students should be expected to do, or to know, or to apply, by the end of your course? What are those main concepts you want students to retain years after taking your course?
  • Ultimately, try to write the objectives from the student’s perspective and tie them in to the promise of your course. Rather than just focusing on the content areas, what do you want the students to be able to do, to understand, or to know, and why is it important that they do so?
  • Most course outcomes consist of a mix of knowledge, skills, and attitudes. Think about not only what knowledge students should gain, but also what skills they will be developing (critical thinking skills, creative thinking skills, application skills, psychomotor skills, etc.) and what attitudes they might be changing.
  • Clear learning outcomes help you align your content, assignments, and grading practices and help you focus on the essential components of the course, rather than trying to fit everything in.
  • Connect your course outcomes with your program’s outcomes, with your graduate degree outcomes, or with the University of Denver Undergraduate Student Outcomes.

Learning Outcome Examples

(Adapted from Walvoord and Anderson, Effective Grading, 1998)

  • Western Civilization I: describe basic historical events and people; argue like a historian does by using historical data as evidence for a position
  • Economics: use economic theory to explain government policies and their effects
  • Physics: explain physical concepts in your own words
  • Speech Pathology: synthesize information from various sources to arrive at intervention tactics for the client

MATC 1200: Calculus for Business and Social Sciences

Generously shared by Deb Carney, Dept of Mathematics

Students should be able to:

  • Relate the concept of the limit to the definition of the derivative
  • Describe the concept of the derivative as an instantaneous rate of change
  • Apply the concepts of the limit and the derivative to solve calculus problems
  • Interpret real-world situations in terms of related calculus concepts
  • Use and apply mathematical models including logarithmic and exponential functions

Additional Resources

Many instructors find the resources below to be helpful when writing learning outcomes.

  • An overview of Bloom’s Taxonomy
  • Description of the Revised Bloom’s Taxonomy with sample verbs
  • A newer taxonomy from L. Dee Fink’s Creating Significant Learning Experiences

The main idea is that most courses should focus on more than just knowledge/remembering outcomes and strive to develop more complex thinking skills among students.

Contact the OTL if you would like someone to review and assist you in writing course learning outcomes.



Learning to Improve: Using Writing to Increase Critical Thinking Performance in General Education Biology

  • Ian J. Quitadamo
  • Martha J. Kurtz

*Department of Biological Sciences, Central Washington University, Ellensburg, WA 98926-7537; and


Department of Chemistry, Central Washington University, Ellensburg, WA 98926-7539

Increasingly, national stakeholders express concern that U.S. college graduates cannot adequately solve problems and think critically. As a set of cognitive abilities, critical thinking skills provide students with tangible academic, personal, and professional benefits that may ultimately address these concerns. As an instructional method, writing has long been perceived as a way to improve critical thinking. In the current study, the researchers compared critical thinking performance of students who experienced a laboratory writing treatment with those who experienced traditional quiz-based laboratory in a general education biology course. The effects of writing were determined within the context of multiple covariables. Results indicated that the writing group significantly improved critical thinking skills whereas the nonwriting group did not. Specifically, analysis and inference skills increased significantly in the writing group but not the nonwriting group. Writing students also showed greater gains in evaluation skills; however, these were not significant. In addition to writing, prior critical thinking skill and instructor significantly affected critical thinking performance, whereas other covariables such as gender, ethnicity, and age were not significant. With improved critical thinking skill, general education biology students will be better prepared to solve problems as engaged and productive citizens.

INTRODUCTION

A national call to improve critical thinking in science.

In the past several years, an increasing number of national reports indicate a growing concern over the effectiveness of higher education teaching practices and the decreased science (and math) performance of U.S. students relative to other industrialized countries ( Project Kaleidoscope, 2006 ). A variety of national stakeholders, including business and educational leaders, politicians, parents, and public agencies, have called for long-term transformation of the K–20 educational system to produce graduates who are well trained in science, can engage intelligently in global issues that require local action, and in general are better able to solve problems and think critically. Specifically, business leaders are calling for graduates who possess advanced analysis and communication skills, for instructional methods that improve lifelong learning, and ultimately for an educational system that builds a nation of innovative and effective thinkers ( Business-Higher Education Forum and American Council on Education, 2003 ). Education leaders are similarly calling for institutions of higher education to produce graduates who think critically, communicate effectively, and who employ lifelong learning skills to address important scientific and civic issues ( Association of American Colleges and Universities, [AACU] 2005 ).

Many college faculty consider critical thinking to be one of the most important indicators of student learning quality. In its 2005 national report, the AACU indicated that 93% of higher education faculty perceived analytical and critical thinking to be an essential learning outcome (AACU, 2005) whereas 87% of undergraduate students indicated that college experiences contributed to their ability to think analytically and creatively. This same AACU report showed that only 6% of undergraduate seniors demonstrated critical thinking proficiency based on Educational Testing Services standardized assessments from 2003 to 2004. During the same time frame, data from the ACT Collegiate Assessment of Academic Proficiency test showed a similar trend, with undergraduates improving their critical thinking less than 1 SD from freshman to senior year. Thus, it appears a discrepancy exists between faculty expectations of critical thinking and students' ability to perceive and demonstrate critical thinking proficiency using standardized assessments (AACU, 2005).

Teaching that supports the development of critical thinking skills has become a cornerstone of nearly every major educational objective since the Department of Education released its six goals for the nation's schools in 1990. In particular, goal three of the National Goals for Education stated that more students should be able to reason, solve problems, and apply knowledge. Goal six specifically stated that college graduates must be able to think critically ( Office of Educational Research and Improvement, 1991 ). Since 1990, American education has tried—with some success—to make a fundamental shift from traditional teacher-focused instruction to more student-centered constructivist learning that encourages discovery, reflection, and in general is thought to improve student critical thinking skill. National science organizations have supported this trend with recommendations to improve the advanced thinking skills that support scientific literacy ( American Association for Higher Education, 1989 ; National Research Council, 1995 ; National Science Foundation, 1996 ).

More recent reports describe the need for improved biological literacy as well as international competitiveness ( Bybee and Fuchs, 2006 ; Klymkowsky, 2006 ). Despite the collective call for enhanced problem solving and critical thinking, educators, researchers, and policymakers are discovering a lack of evidence in existing literature for methods that measurably improve critical thinking skills ( Tsui, 1998 , 2002 ). As more reports call for improved K–20 student performance, it is essential that research-supported teaching and learning practices be used to better help students develop the cognitive skills that underlie effective science learning ( Malcom et al., 2005 ; Bybee and Fuchs, 2006 ).

Critical Thinking

Although they are not always transparent to many college students, the academic and personal benefits of critical thinking are well established; students who can think critically tend to get better grades, are often better able to use reasoning in daily decisions ( U.S. Department of Education, 1990 ), and are generally more employable ( Carnevale and American Society for Training and Development, 1990 ; Holmes and Clizbe, 1997 ; National Academy of Sciences, 2005 ). By focusing on instructional efforts that develop critical thinking skills, it may be possible to increase student performance while satisfying national stakeholder calls for educational improvement and increased ability to solve problems as engaged and productive citizens.

Although academics and business professionals consider critical thinking skill to be a crucial outcome of higher education, many would have difficulty defining exactly what critical thinking is. Historically, there has been little agreement on how to conceptualize critical thinking. Of the literally dozens of definitions that exist, one of the most organized efforts to define (and measure) critical thinking emerged from research done by Peter Facione and others in the early 1990s. Their consensus work, referred to as the Delphi report, was accomplished by a group of 46 leading theorists, teachers, and critical thinking assessment specialists from a variety of academic and business disciplines ( Facione and American Philosophical Association, 1990 ). Initial results from the Delphi report were later confirmed in a national survey and replication study ( Jones et al., 1995 ). In short, the Delphi panel expert consensus describes critical thinking as a “process of purposeful self-regulatory judgment that drives problem-solving and decision-making” ( Facione and American Philosophical Association, 1990 ). This definition implies that critical thinking is an intentional, self-regulated process that provides a mechanism for solving problems and making decisions based on reasoning and logic, which is particularly useful when dealing with issues of national and global significance.

The Delphi conceptualization of critical thinking encompasses several cognitive skills that include: 1) analysis (the ability to break a concept or idea into component pieces in order to understand its structure and inherent relationships), 2) inference (the skills used to arrive at a conclusion by reconciling what is known with what is unknown), and 3) evaluation (the ability to weigh and consider evidence and make reasoned judgments within a given context). Other critical thinking skills that are similarly relevant to science include interpretation, explanation, and self-regulation ( Facione and American Philosophical Association, 1990 ). The concept of critical thinking includes behavioral tendencies or dispositions as well as cognitive skills ( Ennis, 1985 ); these include the tendency to seek truth, to be open-minded, to be analytical, to be orderly and systematic, and to be inquisitive ( Facione and American Philosophical Association, 1990 ). These behavioral tendencies also align closely with behaviors considered to be important in science. Thus, an increased focus on teaching critical thinking may directly benefit students who are engaged in science.

Prior research on critical thinking indicates that students' behavioral dispositions do not change in the short term ( Giancarlo and Facione, 2001 ), but cognitive skills can be developed over a relatively short period of time (Quitadamo, Brahler, and Crouch, unpublished results). In their longitudinal study of behavioral disposition toward critical thinking, Giancarlo and Facione (2001) discovered that undergraduate critical thinking disposition changed significantly after two years. Specifically, significant changes in student tendency to seek truth and confidence in thinking critically occurred during the junior and senior years. Also, females tended to be more open-minded and have more mature judgment than males ( Giancarlo and Facione, 2001 ). Although additional studies are necessary to confirm results from the Giancarlo study, existing research seems to indicate that changes in undergraduate critical thinking disposition are measured in years, not weeks.

In contrast to behavioral disposition, prior research indicates that critical thinking skills can be measurably changed in weeks. In their study of undergraduate critical thinking skill in university science and math courses, Quitadamo, Brahler, and Crouch (unpublished results) showed that critical thinking skills changed within 15 wk in response to Peer Led Team Learning (a national best practice for small group learning). This preliminary study provided some evidence that undergraduate critical thinking skills could be measurably improved within an academic semester, but provided no information about whether critical thinking skills could be changed during a shorter academic quarter. It was also unclear whether the development of critical thinking skills was a function of chronological time or whether it was related to instructional time.

Numerous studies provide anecdotal evidence for pedagogies that improve critical thinking, but much of existing research relies on student self-report, which limits the scope of interpretation. From the literature it is clear that, although critical thinking skills are some of the most valued outcomes of a quality education, additional research investigating the effects of instructional factors on critical thinking performance is necessary ( Tsui, 1998 , 2002 ).

Writing and Critical Thinking

Writing has been widely used as a tool for communicating ideas, but less is known about how writing can improve the thinking process itself ( Rivard, 1994 ; Klein, 2004 ). Writing is thought to be a vehicle for improving student learning ( Champagne and Kouba, 1999 ; Kelly and Chen, 1999 ; Keys, 1999 ; Hand and Prain, 2002 ), but too often is used as a means to regurgitate content knowledge and derive prescribed outcomes ( Keys, 1999 ; Keys et al., 1999 ). Historically, writing is thought to contribute to the development of critical thinking skills ( Kurfiss, and Association for the Study of Higher Education, 1988 ). Applebee (1984) suggested that writing improves thinking because it requires an individual to make his or her ideas explicit and to evaluate and choose among tools necessary for effective discourse. Resnick (1987) stressed that writing should provide an opportunity to think through arguments and that, if used in such a way, could serve as a “cultivator and an enabler of higher order thinking.” Marzano (1991) suggested that writing used as a means to restructure knowledge improves higher-order thinking. In this context, writing may provide opportunity for students to think through arguments and use higher-order thinking skills to respond to complex problems ( Marzano, 1991 ).

Writing has also been used as a strategy to improve conceptual learning. Initial work focused on how the recursive and reflective nature of the writing process contributes to student learning ( Applebee, 1984 ; Langer and Applebee, 1985 , 1987 ; Ackerman, 1993 ). However, conclusions from early writing to learn studies were limited by confounding research designs and mismatches between writing activities and measures of student learning ( Ackerman, 1993 ). Subsequent work has focused on how writing within disciplines helps students to learn content and how to think. Specifically, writing within disciplines is thought to require deeper analytical thinking ( Langer and Applebee, 1987 ), which is closely aligned with critical thinking.

The influence of writing on critical thinking is less defined in science. Researchers have repeatedly called for more empirical investigations of writing in science; however, few provide such evidence ( Rivard, 1994 ; Tsui, 1998 ; Daempfle, 2002 ; Klein, 2004 ). In his extensive review of writing research, Rivard (1994) indicated that gaps in writing research limit its inferential scope, particularly within the sciences. Specifically, Rivard and others indicate that, despite the volume of writing students are asked to produce during their education, they are not learning to use writing to improve their awareness of thinking processes ( Resnick, 1987 ; Howard, 1990 ). Existing studies are limited because writing has been used either in isolation or outside authentic classroom contexts. Factors like gender, ethnicity, and academic ability that are not directly associated with writing but may nonetheless influence its effectiveness have also not been sufficiently accounted for in previous work ( Rivard, 1994 ).

A more recent review by Daempfle (2002) similarly indicates the need for additional research to clarify relationships between writing and critical thinking in science. In his review, Daempfle identified nine empirical studies that generally support the hypothesis that students who experience writing (and other nontraditional teaching methods) have higher reasoning skills than students who experience traditional science instruction. Of the relatively few noninstructional variables identified in those studies, gender and major did not affect critical thinking performance; however, the amount of time spent on and the explicitness of instruction to teach reasoning skills did affect overall critical thinking performance. Furthermore, the use of writing and other nontraditional teaching methods did not appear to negatively affect content knowledge acquisition ( Daempfle, 2002 ). Daempfle justified his conclusions by systematically describing the methodological inconsistencies for each study. Specifically, incomplete sample descriptions, the use of instruments with insufficient validity and reliability, the absence of suitable comparison groups, and the lack of statistical covariate analyses limit the scope and generalizability of existing studies of writing and critical thinking ( Daempfle, 2002 ).

Writing in the Biological Sciences

The conceptual nature and reliance on the scientific method as a means of understanding make the field of biology a natural place to teach critical thinking through writing. Some work has been done in this area, with literature describing various approaches to writing in the biological sciences that range from linked biology and English courses, writing across the biology curriculum, and directed use of writing to improve reasoning in biology courses ( Ebert-May et al., 1997 ; Holyoak, 1998 ; Taylor and Sobota, 1998 ; Steglich, 2000 ; Lawson, 2001 ; Kokkala and Gessell, 2003 ; Tessier, 2006 ). In their work on integrated biology and English, Taylor and Sobota (1998) discussed several problem areas that affected both biology and English students, including anxiety and frustration associated with writing, difficulty expressing thoughts clearly and succinctly, and a tendency to have strong negative responses to writing critique. Although the authors delineate the usefulness of several composition strategies for writing in biology ( Taylor and Sobota, 1998 ), it was unclear whether student data were used to support their recommendations. Kokkala and Gessell (2003) used English students to evaluate articles written by biology students. Biology students first reflected on initial editorial comments made by English students, and then resubmitted their work for an improved grade. In turn, English students had to justify their editorial comments with written work of their own. Qualitative results generated from a list of reflective questions at the end of the writing experience seemed to indicate that both groups of students improved editorial skills and writing logic. However, no formal measures of student editorial skill were collected before biology-English student collaboration, so no definitive conclusions on the usefulness of this strategy could be made.

Taking a slightly different tack, Steglich (2000) informally assessed student attitudes in nonmajors biology courses and noted that writing produced positive changes in student attitudes toward biology; however, the author acknowledged that this work was not a research study. Finally, Tessier (2006) showed that students enrolled in a nonmajors ecology course significantly improved their technical writing skills and committed fewer errors of fact regarding environmental issues in response to a writing treatment. Attitudes toward environmental issues also improved (Tessier, 2006). Although this study surveyed students at the beginning and end of the academic term and tracked student progress during the quarter, instrument validity and reliability were not reported. The generalizability of the results was further limited by an overreliance on student self-reports and a small sample size.

Each of the studies described above peripherally supports a relationship between writing and critical thinking. Although not explicitly an investigation of critical thinking, results from a relatively recent study support a stronger connection between writing and reasoning ability (Daempfle, 2002). Ebert-May et al. (1997) used a modified learning cycle instructional method and small group collaboration to increase reasoning ability in general education biology students. A quasi-experimental pretest/posttest control group design was applied to a comparatively large sample of students, and considerable thought was given to controlling extraneous variables across the treatment and comparison groups. A multifaceted assessment strategy based on writing, standardized tests, and student interviews was used to quantitatively and qualitatively evaluate student content knowledge and thinking skill. Results indicated that students in the treatment group significantly outperformed control group students on reasoning and process skills as measured by the National Association of Biology Teachers (NABT) content exam. Notably, student content knowledge did not differ significantly between the treatment and control sections, indicating that development of thinking skill did not occur at the expense of content knowledge (Ebert-May et al., 1997). Interview data indicated that students experiencing the writing and collaboration-based instruction changed how they perceived the construction of biological knowledge and how they applied their reasoning skills. Although the Ebert-May study is one of the more complete investigations of writing and critical thinking to date, several questions remain. Supporting validity and reliability data for the NABT exam were not included in the study, making interpretation of the results somewhat less certain. In addition, the NABT exam is designed to assess high school biology performance, not college performance (Daempfle, 2002). Perhaps more importantly, the NABT exam does not explicitly measure critical thinking skills.

Collectively, it appears that additional research is necessary to establish a more defined relationship between writing and critical thinking in science ( Rivard, 1994 ; Tsui, 1998 , 2002 ; Daempfle, 2002 ). The current study addresses some of the gaps in previous work by evaluating the effects of writing on critical thinking performance using relatively large numbers of students, suitable comparison groups, valid and reliable instruments, a sizable cadre of covariables, and statistical analyses of covariance. This study uses an experimental design similar to that of the Ebert-May et al. (1997) study but incorporates valid and reliable test measures of critical thinking that can be used both within and across different science disciplines.

Purpose of the Study

Currently there is much national discussion about increasing the numbers of students majoring in various science fields ( National Research Council, 2003 ; National Academy of Sciences, 2005 ). Although this is a necessary and worthwhile goal, attention should also be focused on improving student performance in general education science because these students will far outnumber science majors for the foreseeable future. If college instructors want general education students to think critically about science, they will need to use teaching methods that improve student critical thinking performance. In many traditional general education biology courses, students are not expected to work collaboratively, to think about concepts as much as memorize facts, or to develop and support a written thesis or argument. This presents a large problem when one considers the societal role that general education students will play as voters, community members, and global citizens. By improving their critical thinking skills in science, general education students will be better able to deal with the broad scientific, economic, social, and political issues they will face in the future. To address these concerns, the current study investigated the following research questions:

Does writing in laboratory affect critical thinking performance in general education biology?

Does the development of analysis, inference, and evaluation skills differ between students who experience writing versus those who experience traditional laboratory instruction?

What measurable effect do factors like gender, ethnicity, and prior thinking skill have on changes in critical thinking in general education biology?

If critical thinking skills change during an academic quarter, when does that take place?

MATERIALS AND METHODS

Study context.

The study took place at a state-funded regional comprehensive university in the Pacific Northwest. All participants were nonmajor undergraduates who were taking biology to satisfy their general education science requirement. Ten total sections of general education biology offered over three academic quarters (one academic year) were included in the study. Four of the 10 sections implemented a writing component during weekly laboratory meetings (N = 158); six traditional quiz-based laboratory sections served as a nonwriting control group (N = 152). Only scores from students who had completed both the initial (pretest) and end-of-quarter (posttest) critical thinking assessments were included in the data analysis. A breakdown of participant demographics for the writing and nonwriting groups is provided in Table 1 .

Demographics for the writing and nonwriting groups

Sample            Class distribution (%)              Gender distribution (%)
                  Fr    So    Jr    Sr   2nd Sr       M     F
Writing (158)     44.9  33.5  15.2  3.8  2.5          38.6  61.4
No writing (152)  53.3  28.3  7.2   9.2  2.0          38.2  61.8
Overall (310)     49.0  31.0  11.3  6.5  2.3          38.4  61.6

Sample            Ethnic distribution (%)
                  Caucasian  Hispanic  African American  Native American  Asian  Other
Writing (158)     84.8       1.9       2.5               0                4.4    6.3
No writing (152)  81.6       4.6       1.3               1.3              5.9    5.3
Overall (310)     83.2       3.2       1.9               0.6              5.2    5.8

Demographic profile of the study sample; n values in parentheses.

a Other includes the "choose not to answer" response.

Each course section included a lecture component offered four times per week for 50 min and a laboratory component that met once a week for 2 h. Course lecture sections were limited to a maximum enrollment of 48 students, with two concurrent lab sections of 24 students. Two different instructors taught five writing sections and five other instructors taught 11 traditional sections over three consecutive quarters. Each course instructor materially participated in teaching laboratory with the help of one graduate assistant per lab section (two graduate students per course section). None of the instructors from treatment sections had implemented writing in the laboratory before the start of this study. Writing instructors were chosen on the basis of personal dissatisfaction with traditional laboratory teaching methods and willingness to try something new.

Strong efforts were made to establish equivalency between writing and nonwriting course sections a priori. Course elements that were highly similar included common lecture rooms, the use of similar (in most cases identical) textbooks, and a lab facility coordinated by a single faculty member. More specifically, three similarly appointed lecture rooms outfitted with contemporary instructional technology including dry erase boards, media cabinets, a networked computer, and digital projection were used to teach the nonmajors biology courses. The same nonmajors biology textbook was used across the writing and most of the nonwriting sections. All laboratory sections used a common lab facility and were taught on the same day of the week. Although the order in which specific labs were taught differed among sections, a common laboratory manual containing prescriptive exercises covering the main themes of biology (scientific method, cellular biology and genetics, natural selection and evolution, kingdoms of life, and a mammalian dissection) was used across all writing and nonwriting lab sections.

Primary course differences included the presence of a writing component in the laboratory and how laboratory time was allocated. Sections that experienced the writing treatment completed the prescriptive lab exercises in the first hour and engaged in writing during the second hour of the lab. Nonwriting sections allocated 2 h for the prescriptive lab exercises and included a traditional laboratory quiz rather than a writing assignment. The degree to which the writing and nonwriting sections included small group collaboration in laboratory varied, and all course sections differed with regard to individual instructor teaching style. Although all course sections used traditional lecture exams during the quarter to assess content knowledge, the degree to which rote memorization-based exam questions were used to evaluate student learning varied.

Description of the Writing Treatment

On the first day of lecture, students in the writing treatment group were told that their laboratory performance would be evaluated using collaborative essays instead of traditional quizzes. A brief overview of the writing assignments was included in associated course syllabi. During the first laboratory session of the quarter, students were grouped into teams of three or four individuals, and the criteria for completing weekly writing assignments were further explained.

The decision to use collaborative groups to support writing in the laboratory was partly based on existing literature ( Collier, 1980 ; Bruffee, 1984 ; Tobin et al., 1994 ; Jones and Carter, 1998 ; Springer et al., 1999 ) and prior research by Quitadamo, Brahler, and Crouch (unpublished results), who showed that Peer Led Team Learning (one form of collaborative learning) helped to measurably improve undergraduate critical thinking skills. Small group learning was also used in the nonwriting treatment groups to a greater or lesser extent depending on individual instructor preference.

Baseline critical thinking performance was established in the academic quarters preceding the writing experiment to more specifically attribute changes in critical thinking to the writing treatment. Concurrent nonwriting course sections were also used as comparison groups. The historical baseline provided a way to determine what student performance had been before experiencing the writing treatment, whereas the concurrent nonwriting groups allowed for a direct comparison of critical thinking performance during the writing treatment. Pretest scores indicating prior critical thinking skill were also used to further establish comparability between the writing and nonwriting groups.

Laboratory activities were coordinated for all sections by a single faculty member who taught in the nonwriting group. All faculty and graduate assistants met regularly to discuss course progress and laboratory procedure and to coordinate resources. Nonwriting faculty drafted quizzes that addressed laboratory content knowledge. Writing faculty collaboratively crafted a consensus essay, or thought question, designed to elicit student critical thinking and ability to apply content knowledge. Each thought question was designed so that students had to apply lecture concepts and build on their conceptual understanding by integrating actual laboratory experiences (see Supplemental Appendix 1, available online, for thought question examples). Weekly thought questions became progressively more difficult as the term progressed. Initial planning meetings took place just before the beginning of the academic quarter and included graduate assistant training to help them learn to consistently evaluate student writing using a modified thesis-based essay rubric (see Supplemental Appendix 2; Beers et al., 1994). A range of sample essays from poor to high quality was used to calibrate graduate assistant scoring and ensure consistency between assistants from different laboratory sections within the writing group. All graduate assistants and course instructors applied the thesis-based rubric to sample essays and worked toward consensus. Initial training ended when all graduate assistants scored within 0.5 points of each other on at least two sample essays.
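The 0.5-point calibration criterion described above amounts to a simple range check on rater scores. The helper below is a hypothetical illustration of that check, not part of the study's materials:

```python
def calibrated(scores, tol=0.5):
    """True when all rater scores for one sample essay fall within
    `tol` rubric points of one another (the 0.5-point criterion)."""
    return max(scores) - min(scores) <= tol

# Hypothetical rubric scores from three graduate assistants
print(calibrated([3.0, 3.5, 3.25]))   # within 0.5 points of each other
print(calibrated([3.0, 4.0, 3.5]))    # spread of 1.0 point, not yet calibrated
```

In practice, training would continue until this check passed for every assistant pair on at least two sample essays.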

Students were given weekly thought questions before beginning laboratory to help them frame their efforts during laboratory exercises. Students completed the prescriptive lab activities during the first hour, and then each student group relocated to an assigned computer lab in the same building and worked around a common computer terminal to draft a collective response to the weekly thought question. Students were allowed to use any suitable information or materials (laboratory observations, laboratory manuals, lecture notes, textbooks, the Internet, etc.) to help them address their thought question. Internal group discussions allowed students to argue individual viewpoints as they worked toward group agreement on each thought question. Essay responses to thought questions followed a standard five-paragraph format. Each essay included an introduction with a group-generated thesis statement, two to three body paragraphs that provided sufficient detail to support the thesis statement, and a summary paragraph that concluded the essay. Students were not allowed to work on essays outside of the laboratory environment.

Initial essay drafts were composed in Microsoft Word and submitted to the graduate assistant by the end of the laboratory period using the campus e-mail system. Graduate assistants evaluated each group's essay (typically six per lab section) and assigned an initial grade based on the thesis-based essay rubric. Graduate assistants made comments and suggestions electronically using Microsoft Word revising and track changes tools. Evaluated essays were e-mailed back to each student group, which addressed comments and suggestions during the subsequent week's laboratory writing time. Each student group submitted a final draft that was re-evaluated and assigned a final grade. During the second week, students both revised their essay from the previous week and then generated an initial draft for the current week's thought question, all within the lab writing hour. This was done to help students become more proficient writers within a short period of time. Overall, students in the writing group completed eight essays that, along with lab book scores, constituted 25% of their overall course grade. An identical percentage was used to calculate traditional quiz and lab book scores in all nonwriting course sections.

At the end of the quarter, each writing group member completed a peer evaluation for all group members, including themselves (see Supplemental Appendix 3 ). This was done to help students reflect on and evaluate their own performance, maximize individual accountability within the group, and make sure students received credit proportional to their contributions. The average peer evaluation score for each student was included as 5% of the final course grade.

Collectively, this approach to writing and evaluation was used to 1) help students reflect on and discuss deficiencies in their collective and written work, 2) provide an opportunity for students to explicitly address deficiencies in thesis development and general writing skill, 3) provide a suitable reward for student efforts to revise their work relative to established performance benchmarks, 4) improve individual accountability within each group, and 5) help students develop more efficient and effective writing skills that collectively might lead to improved critical thinking skill.

Assessment of Critical Thinking

Using critical thinking to indicate student learning performance is particularly useful because it can be measured within and across disciplines. Various instruments are available to assess critical thinking ( Watson and Glaser, 1980 ; Ennis and Weir, 1985 ; Facione, 1990b ; Center for Critical Thinking and Moral Critique, 1996 ); however, only the CCTST measures cognitive and meta-cognitive skills associated with critical thinking, is based on a consensus definition of critical thinking, and has been evaluated for validity and reliability for measuring critical thinking at the college level ( Facione, 1990a ; Facione et al., 1992 , 2004 ). The CCTST measures cognitive skills of analysis, inference, evaluation, induction, and deduction, with results expressed as raw scores or national percentile equivalents based on a national norming sample of students from 4-yr colleges and universities. Construct validity for the CCTST is high as indicated by greater than 95% consensus of the Delphi panel experts on the component skills of critical thinking. Test reliability (calculated using the KR–20 internal consistency method) is 0.78–0.84 for the form used in this study, a value considered to be within the recommended range for tests that measure a wide range of critical thinking skills ( Facione, 1991 ). The CCTST norming sample for 4-yr colleges and universities is based on a stratified sample of 2000 students from various disciplines, with approximately 30% of the norming sample comprised of science and math students. Approximately 20,000 college students complete the CCTST each year ( Insight Assessment and Blohm, 2005 ).
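The KR-20 reliability coefficient cited above can be computed directly from scored item responses. A minimal sketch, using toy data rather than actual CCTST responses:

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson 20 internal consistency for dichotomous test items.

    responses: 2-D array-like, rows = examinees, columns = items (1 = correct).
    """
    r = np.asarray(responses, dtype=float)
    k = r.shape[1]                         # number of items
    p = r.mean(axis=0)                     # proportion correct per item
    pq_sum = (p * (1.0 - p)).sum()         # summed item variances
    var_total = r.sum(axis=1).var(ddof=1)  # variance of examinee total scores
    return (k / (k - 1)) * (1.0 - pq_sum / var_total)

# Toy data: 4 examinees x 3 items (illustrative only)
scores = [[1, 1, 1], [1, 1, 0], [0, 0, 0], [1, 0, 0]]
print(round(kr20(scores), 4))  # -> 0.9375
```

Values in the 0.78-0.84 range reported for the CCTST indicate that item responses hang together well enough to treat the total score as a coherent measure.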

The CCTST contains 34 questions and is a 45-min timed assessment of critical thinking. An online version of the CCTST was administered in this study, which allowed the researchers to collect student demographics data including gender, ethnicity, age, and several others at the same time critical thinking skill was measured. Total critical thinking skill as well as analysis, inference, and evaluation component critical thinking skills ( Facione, 1990c ) were determined for each CCTST administration and compared across the writing and nonwriting groups.

Research Design

A quasi-experimental pretest/posttest control group design was used for this study to determine whether critical thinking performance in the writing group differed significantly from the nonwriting group. This design was chosen in order to compare critical thinking performance between intact groups, and because it was not feasible to randomly assign students from one course section to another within the sample. Frequency distributions of pretest/posttest changes in total critical thinking skill and analysis, inference, and evaluation component critical thinking skills were constructed to provide some indication of sample randomness and to inform assumptions for subsequent statistical analyses of covariance (see Figure 1 , A–D).

The pretest/posttest control group design was also used in order to minimize internal validity threats that could potentially compete with the effects of the writing treatment on student critical thinking performance. This design is widely used in educational research, and generally controls for most threats to internal validity ( Campbell and Stanley, 1963 ). Internal threats that remain a concern include history, maturation, pretest sensitization, selection, and statistical regression toward the mean. In the current study, history and maturation threats were minimized to the extent that the CCTST pretest and posttest were administered only 9 wk apart, and class standing and age covariables that indicate maturation were included in the statistical analysis. Pretest sensitization and selection are larger concerns for this design. Pretest sensitization was minimized in several ways: 1) prior critical thinking skill indicated by the CCTST pretest was used as a covariable in statistical analyses, 2) pretest/posttest to posttest only comparison studies conducted by Insight Assessment indicate CCTST pretest sensitization is minimized ( Facione, 1990a ), and 3) neither the students, instructors, nor the test administrators have access to the correct answers on the CCTST, so repeat performance on the posttest is less likely. Selection threats were also reduced by using CCTST pretest scores in the statistical analyses, thereby making it more difficult to detect statistically significant differences in critical thinking performance between the writing and nonwriting groups. Statistical regression toward the mean, which was observed to some extent in this study, was minimized because this study used a valid and reliable instrument to assess critical thinking ( Facione, 1990a ). Regression threats were also minimized to the extent that students with higher initial scores regressed much less than students with lower initial scores.

The generalizability of study results is limited because all data were collected at a single university. Specific threats to external validity include selection-treatment interaction and treatment diffusion. These threats were minimized because writing was mandatory for all treatment group participants, thereby minimizing volunteer effects. Because the writing also took considerable student effort, it is less likely that treatment diffusion occurred. In summary, the pretest/posttest control group design was used to minimize internal and external validity threats and maximize the ability to determine the effects of writing on student critical thinking performance.

Study Variables and Data Analysis

Effect of writing on critical thinking performance.

General education biology students were divided into writing and nonwriting groups (independent variable). Changes in CCTST pretest/posttest scores (dependent variable) were determined to discover whether writing influenced student critical thinking performance. Two CCTST outcome measures were used to statistically test for writing effect: 1) raw scores for total critical thinking skill, and 2) raw scores for analysis, inference, and evaluation component skills. Results were reported using raw scores and corresponding national percentile rank so that critical thinking performance outcomes would be more meaningful and intuitive. Conversion of CCTST raw scores to national percentile ranking was done using SPSS (SPSS, Inc., Chicago, IL) statistical software and a linear estimation conversion script based on an equivalency scale from Insight Assessment (Millbrae, CA).
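The linear estimation conversion described here can be sketched with ordinary linear interpolation. Because the Insight Assessment equivalency scale is not reproduced in the text, the anchor values below are hypothetical placeholders, not the actual scale:

```python
import numpy as np

# Hypothetical anchor points standing in for the proprietary
# raw-score -> national percentile equivalency scale.
raw_anchors = np.array([10, 14, 16, 18, 22, 28])
pct_anchors = np.array([20, 38, 46, 55, 74, 95])

def raw_to_percentile(raw):
    """Linearly interpolate CCTST raw scores onto national percentile
    ranks, mirroring the linear estimation conversion described above."""
    return np.interp(raw, raw_anchors, pct_anchors)

print(raw_to_percentile(16))   # lands exactly on an anchor value
print(raw_to_percentile(15))   # interpolated between the 14 and 16 anchors
```

The same conversion applied to pre- and posttest means yields the percentile gains reported in the Results.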

Several covariables were included in the analysis to increase statistical accuracy and precision, and to more specifically isolate the effects of writing on critical thinking performance. CCTST pretest scores were used to indicate initial critical thinking skill. Gender and ethnicity helped to account for male/female or race-specific changes in critical thinking performance and were also used to identify potential sources of performance bias. Academic term and time of day were used to account for critical thinking differences due to the time of year each course was offered and the time of day each student took the course, respectively. Class standing and age were used to indicate maturation related to time in college and chronological age, respectively. Finally, the instructor covariable was used to account for performance differences due to individual teaching styles.

Statistical Analysis of Effect of Writing.

Several statistical analyses were conducted to determine the effects of writing on critical thinking performance in general education biology. First, an analysis of covariance (ANCOVA) test provided insight regarding differences in overall critical thinking performance between the writing and nonwriting groups. Changes in CCTST total raw scores and national percentile ranking were used as composite measures of critical thinking ( Facione, 1990c ) in this initial analysis. Second, changes in particular component critical thinking skills (analysis, inference, and evaluation) were evaluated using a multivariate analysis of covariance (MANCOVA) test because there were three dependent variables. The ANCOVA and MANCOVA tests also provided some insight into the effect the covariables had on critical thinking performance in general education biology. Collectively, these statistical tests allowed for a more accurate and precise analysis because variance associated with the covariables could be more specifically isolated from the writing treatment. Mean, SE, and effect size were also compared between the writing and nonwriting groups. Effect size, represented in standard units, was used to compare the magnitude of writing effect in the study.

Analysis of Thought Question Performance.

Performance on weekly thought questions was analyzed to discover specifically when and how much student critical thinking skills changed during the academic term. This analysis also provided context for CCTST critical thinking performance measures. Specifically, average scores from a representative sample of writing course sections (approximately 100 students) were used to compare initial essay drafts across the weeks of the term to discover when students began to show changes in their first attempt at each essay. Weekly performance on final revised essays was also compared to determine how student final submissions changed over time. Finally, the weekly difference between each initial essay and each final essay was compared to determine how much the revision process changed during the term. These calculations collectively helped to provide a profile of critical thinking performance over time.

Participant Demographics

Student demographics provided in Table 1 indicated an overall distribution of approximately 49% freshmen, 31% sophomores, 11% juniors, and 9% seniors. Approximately 78% of the writing group students were freshmen and sophomores, whereas 82% of the nonwriting group were underclassmen. Overall, 61% of the sample was female and 39% male, with nearly identical gender distribution across the writing and nonwriting groups. The predominant ethnicity in the sample was Caucasian (>83%), with Asian American (5%), Latino/Hispanic (3%), African American (2%), and Native American (1%) students comprising the remainder of the sample. About 6% of the sample classified themselves as having some other ethnicity or chose not to identify their ethnic heritage.

Statistical Assumptions

Analysis of covariance and multivariate analysis of covariance tests were used to compare critical thinking performance between the writing and nonwriting groups. The evaluated assumptions for the ANCOVA and MANCOVA tests were homogeneity of slopes, homogeneity of covariances, and normality. An analysis evaluating the homogeneity of slopes assumption indicated that the relationship between the covariables and the critical thinking performance dependent variable did not differ significantly by the writing/nonwriting independent variable for the ANCOVA test, F(2, 307) = 1.642, p = 0.195, power = 0.346, partial η² = 0.011, or the MANCOVA test, F(6, 610) = 1.685, p = 0.122, power = 0.645, partial η² = 0.016. These results confirmed that both analyses of covariance met the homogeneity of slopes assumption. The homogeneity of covariance assumption was tested using Levene's and Box's tests. Levene's test results for the ANCOVA indicated that error variances were not equal across the writing and nonwriting groups, F(1, 308) = 7.139, p = 0.008. Similarly, Box's test results indicated that covariance was not equal for the writing and nonwriting groups, F(6, 684,530) = 4.628, p = 0.000. These results indicated that the ANCOVA/MANCOVA tests did not meet the homogeneity of covariance assumption. To more fully evaluate this assumption, distributions of total and component critical thinking skill were constructed (see Figure 1, A–D). Because the writing and nonwriting groups were highly similar in size and no post hoc tests were conducted, this violation was judged unlikely to bias the results, and the ANCOVA and MANCOVA tests were retained as the best statistical measures to answer the research questions. Finally, the normality assumption was evaluated using the previously constructed frequency distributions for total change in critical thinking (Figure 1A) as well as change in analysis (Figure 1B), inference (Figure 1C), and evaluation (Figure 1D) critical thinking skills. Frequency distributions of total and component critical thinking dependent variables indicated that each approximated a standard normal curve.
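The variance-homogeneity check reported above can be reproduced with scipy's implementation of Levene's test. The simulated group means, spreads, and sizes below are illustrative placeholders, not the study's data:

```python
import numpy as np
from scipy import stats

# Simulated CCTST change-score distributions (parameters are illustrative)
rng = np.random.default_rng(42)
writing = rng.normal(1.2, 3.0, 158)
nonwriting = rng.normal(-0.5, 4.0, 152)

# Levene's test: H0 = equal error variances across the two groups,
# the same assumption check reported for the ANCOVA above.
w_stat, p_value = stats.levene(writing, nonwriting)
print(f"W = {w_stat:.3f}, p = {p_value:.4f}")
```

A small p-value here, as in the study, signals unequal variances; with near-equal group sizes and no post hoc tests, ANCOVA is commonly considered tolerably robust to that violation.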

Figure 1.

Figure 1. (A–D) Frequency distribution of change in critical thinking skills. Distribution of change in critical thinking skill for the experimental sample. Changes are indicated using raw scores from CCTST pre- and posttests for total critical thinking skill (A) as well as analysis (B), inference (C), and evaluation (D) component critical thinking skills.

Effect of Writing on Total Critical Thinking Performance

The ANCOVA test of total critical thinking performance showed that writing and nonwriting groups differed significantly, F(1, 300) = 19.357, p < 0.0001, power = 0.992, partial η² = 0.061 (see Table 2 ). The strength of the relationship between the writing/nonwriting groups and critical thinking performance was modest but significant, accounting for more than 6% of the variance in critical thinking performance.

ANCOVA results for total critical thinking performance

Treatment        F       df   p      Power  Effect size
Writing          19.357  300  0.000  0.992  0.061
CCTST pretest    19.713  300  0.000  0.993  0.062
Instructor       7.745   300  0.006  0.792  0.025
Time of day      6.291   300  0.013  0.705  0.021
Gender           0.226   300  0.635  0.076  0.001
Ethnicity        2.326   300  0.128  0.330  0.008
Age              0.453   300  0.502  0.103  0.002
Class standing   0.002   300  0.962  0.050  0.000
Academic term    2.387   300  0.123  0.338  0.008

Analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pre-test), academic term, time of day, and instructor.

a Significance tested at 0.05 level.

Descriptive statistics of total critical thinking performance in the writing and nonwriting groups were also calculated (see Table 3 ). The writing group showed an average CCTST raw score change of 1.18 compared with the nonwriting group, which showed an average raw score change of −0.51. These critical thinking raw scores equated to gains in national percentile rank of 7.47 (45th to 53rd percentile) for the writing group and −2.09 (42nd to 40th percentile) for the nonwriting group. Critical thinking improvement in the writing group was approximately nine times greater than the nonwriting group (see Figure 2 ).

Writing effect on total critical thinking performance: CCTST raw scores

Treatment          Mean raw score     SEM            Raw CT change
                   Pre      Post      Pre    Post
Writing (158)      15.84    17.02     0.32   0.36    1.18
Nonwriting (152)   15.46    14.95     0.34   0.43    −0.51
Overall (310)      15.65    16.00     0.33   0.40    0.34

Comparison of writing and nonwriting group performance based on CCTST raw scores. CCTST raw score range was 0–34; n values in parentheses.

Figure 2.

Figure 2. Effect of writing on total critical thinking national percentile rank. Comparison of total critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The ANCOVA test of total critical thinking skill indicated that gender, ethnicity, age, class standing, and academic term did not significantly affect critical thinking performance (see Table 2 ). Covariables that significantly affected total critical thinking performance included 1) CCTST pretest score, F(1, 300) = 19.713, p < 0.0001, power = 0.993, partial η² = 0.062, 2) instructor, F(1, 300) = 7.745, p = 0.006, power = 0.792, partial η² = 0.025, and 3) time of day, F(1, 300) = 6.291, p = 0.013, power = 0.705, partial η² = 0.021. The effect of prior critical thinking skill (CCTST pretest) was moderately strong, accounting for more than 6% of the variance in total critical thinking performance. The effects of instructor and time of day were smaller, accounting for 2.5% and 2%, respectively, of total critical thinking performance variance. Critical thinking improvement associated with CCTST pretest score was approximately 2.5 times greater than for instructor and nearly 3 times greater than for time of day.

Effect of Writing on Component Critical Thinking Performance

The MANCOVA test indicated that analysis, inference, and evaluation critical thinking skills differed significantly between the writing and nonwriting groups, Wilks λ = 0.919, F(3, 296) = 8.746, p < 0.0001, power = 0.995, partial η 2 = 0.081 (see Table 4 ). The strength of the relationship between writing and component critical thinking performance was modest but significant, accounting for more than 8% of the variance in critical thinking performance.

Table 4. MANCOVA results for component critical thinking performance

| Effect | Wilks λ | F | df | p | Power | Effect size (partial η²) |
|---|---|---|---|---|---|---|
| Writing | 0.919 | 9.746 | 296 | 0.000 | 0.995 | 0.081 |
| Analysis pretest | 0.623 | 59.737 | 296 | 0.000 | 1.000 | 0.377 |
| Inference pretest | 0.681 | 46.222 | 296 | 0.000 | 1.000 | 0.319 |
| Evaluation pretest | 0.613 | 62.398 | 296 | 0.000 | 1.000 | 0.387 |
| Gender | 0.984 | 1.602 | 296 | 0.189 | 0.420 | 0.016 |
| Ethnicity | 0.983 | 1.756 | 296 | 0.156 | 0.456 | 0.017 |
| Age | 0.988 | 1.153 | 296 | 0.328 | 0.456 | 0.012 |
| Class standing | 0.978 | 2.186 | 296 | 0.090 | 0.553 | 0.022 |
| Instructor | 0.956 | 4.508 | 296 | 0.004 | 0.880 | 0.044 |
| Quarter | 0.991 | 0.899 | 296 | 0.442 | 0.246 | 0.009 |
| Time of day | 0.980 | 2.022 | 296 | 0.111 | 0.517 | 0.020 |

Multivariate analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pretest), academic term, time of day, and instructor.
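The tabulated effect sizes are consistent with the standard single-df relationship between Wilks λ and multivariate partial η² (η² = 1 − λ when s = 1); a quick check, assuming that relationship holds for these effects:

```python
# Wilks lambda values for the significant MANCOVA effects (Table 4).
wilks = {
    "writing": 0.919,
    "analysis pretest": 0.623,
    "inference pretest": 0.681,
    "evaluation pretest": 0.613,
}

# For a single-df effect, multivariate partial eta squared = 1 - lambda.
eta_sq = {effect: round(1 - lam, 3) for effect, lam in wilks.items()}
print(eta_sq)
# {'writing': 0.081, 'analysis pretest': 0.377,
#  'inference pretest': 0.319, 'evaluation pretest': 0.387}
```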

Specifically, significant gains in analysis and inference skills were observed in the writing group but not the nonwriting group. No statistically significant gains in evaluation skill were observed in either group (see Table 5 ). National percentile rank equivalents for CCTST component raw scores indicated the writing group gained 10.51 percentile points in analysis skill (42nd to 52nd percentile), 6.05 in inference skill (45th to 52nd percentile), and 5.16 in evaluation skill (46th to 52nd percentile). The nonwriting group showed changes of −4.43 percentile points in analysis skill (47th to 42nd percentile), −2.23 in inference skill (42nd to 40th percentile), and 1.37 in evaluation skill (44th to 45th percentile; see Figure 3 ). Critical thinking gains in the writing group were 15 times greater for analysis and 8 times greater for inference than in the nonwriting group. Although neither group showed significant gains in evaluation skill, the writing group improved more than 3 times as much as the nonwriting group.

Table 5. Effect of writing on component critical thinking performance

| Component skill | Writing (n = 158) mean | Writing SEM | Nonwriting (n = 152) mean | Nonwriting SEM |
|---|---|---|---|---|
| Analysis (pre) | 4.22 | 0.11 | 4.35 | 0.11 |
| Analysis (post) | 4.54 | 0.10 | 3.99 | 0.13 |
| Analysis (change) | 0.33 | 0.11 | −0.36 | 0.14 |
| Inference (pre) | 7.42 | 0.18 | 7.07 | 0.20 |
| Inference (post) | 7.91 | 0.20 | 6.83 | 0.23 |
| Inference (change) | 0.48 | 0.16 | −0.24 | 0.21 |
| Evaluation (pre) | 4.20 | 0.14 | 4.04 | 0.15 |
| Evaluation (post) | 4.57 | 0.15 | 4.13 | 0.17 |
| Evaluation (change) | 0.37 | 0.15 | 0.09 | 0.17 |

Comparison of writing and nonwriting group performance based on critical thinking component skill raw scores (CCTST subscales). Score range was 0–7 (analysis), 0–16 (inference), and 0–11 (evaluation).

Figure 3. Effect of writing on component critical thinking national percentile rank. Comparison of component critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The MANCOVA test of analysis, inference, and evaluation skills indicated that gender, ethnicity, age, class standing, academic term, and time of day did not significantly affect critical thinking performance. Critical thinking performance was affected by prior analysis, inference, and evaluation skill (CCTST component pretest scores) and instructor (see Table 4 ). Specifically, component pretest scores had a large effect on critical thinking, accounting for 38% (analysis), 32% (inference), and 39% (evaluation) of critical thinking performance variance. The effect of instructor was smaller, accounting for 4.4% of variation in critical thinking skill. The effect of prior component critical thinking skill was approximately 4.5 times greater than the effect of writing, and nearly 9 times greater than the effect of instructor.

Student Thought Question Performance

Critical thinking performance on student essays was evaluated by applying a thesis-based essay rubric (see Supplemental Appendix 2 ) to initial submissions and final revised essays. Average weekly performance during the academic term is shown in Figure 4 . A comparison of initial essays indicated that students improved 53.3 percentage points from week 1 (average score of 27.9%) to week 7 (average score of 81.2%). A similar comparison of final essays showed that students improved 32.5 percentage points from week 1 (average score of 54.1%) to week 7 (average score of 86.6%). The largest change between initial and final essays occurred in week 1 (26.2 percentage points) and decreased each week thereafter (24.8, 23.9, 18.8, 8.0, 7.8, and 5.4 points for weeks 2 through 7, respectively). These results showed that students produced little evidence of critical thinking skill in their writing early in the term but improved dramatically on both initial and revised essay submissions by the end of the term.
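The gains reported here are straight differences between weekly rubric averages (scores are percentages, so gains are percentage points); a minimal check using the week 1 and week 7 values from the text:

```python
# Average rubric scores (%) reported in the text; weeks 1 and 7 only.
initial = {1: 27.9, 7: 81.2}   # initial essay submissions
final = {1: 54.1, 7: 86.6}     # final revised essays

term_gain_initial = round(initial[7] - initial[1], 1)   # gain on initial essays
term_gain_final = round(final[7] - final[1], 1)         # gain on final essays
week1_revision_gain = round(final[1] - initial[1], 1)   # initial-to-final, week 1
print(term_gain_initial, term_gain_final, week1_revision_gain)  # 53.3 32.5 26.2
```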

Figure 4. Profile of change in critical thinking performance in writing group. Comparison of student writing performance on weekly initial and revised essays. Essay scores were derived using a thesis-based critical thinking rubric (see Supplemental Appendix 2 ). Average essay scores were computed across writing sections.

DISCUSSION

The purpose of this study was to discover whether writing could measurably influence critical thinking performance in general education biology. Results indicated that students from the writing group significantly outperformed their nonwriting peers in both total critical thinking skill and the component critical thinking skills of analysis and inference. The writing and nonwriting groups were highly similar initially and began the academic term with comparable critical thinking ability (45th and 42nd national percentile for writing and nonwriting, respectively). By the end of the term, writing students had improved their critical thinking skill to above the 52nd percentile whereas nonwriting students decreased to below the 40th percentile. In addition to writing, prior critical thinking skill and course instructor significantly affected critical thinking performance, with prior critical thinking skill having the largest effect on critical thinking gains of any variable tested. Further analysis of the writing group showed that the largest gains in critical thinking occurred during the first few weeks of the term, with graduated improvement during the remainder of the term. A comparison of average critical thinking performance on initial essays and revised essays showed that thinking skills improvement was greater on initial essays (53%) than on final essays (33%). Collectively, the results of this study indicated that students who experienced writing in general education biology significantly improved their critical thinking skills.

The covariance analysis provided a partial means of separating the effects of writing, prior critical thinking skill, instructor, and multiple covariables on total and component critical thinking gains. The analysis of total critical thinking skill indicated that writing students moved from below the national average to above the national average within an academic quarter, whereas nonwriting students remained below the national average. This observation is important because it shows that students can develop critical thinking skills within a fairly short 9-wk period and that writing can play a role in that process. A similar study showed critical thinking skills improve over 15 wk (Quitadamo, Brahler, and Crouch, unpublished results); however, that study provided no insight into whether critical thinking skills could be changed over a shorter period, in a different academic setting, or in response to instructional variables such as writing.

Although critical thinking gains were influenced by writing, they did not appear to be affected by gender, ethnicity, class standing, or age. In fact, statistical results indicated that these variables collectively had a very small effect on critical thinking performance. Gender distribution was nearly identical across the writing and nonwriting groups and was predominantly female (nearly 62%). Ethnic distribution was also highly similar across the two groups, although the sample was largely Caucasian (>84%). Class standing varied somewhat more across the groups, with the sample largely composed of underclassmen (70%). Although nearly three-quarters of the sample was between 18 and 21 years of age, nearly 10% was over 21, with a fair number of older nontraditional students represented. It is possible that a more diverse sample would have produced different results, or it may be that the individuals participating in this study responded particularly well to writing. Although further investigation of these variables is necessary and important, it was beyond the scope of the current study.

The analysis of component skills provided greater insight into the particular critical thinking skills that students changed in response to writing. Specifically, writing students significantly improved their analysis and inference skills whereas nonwriting students did not. Writing students also improved their evaluation skills much more than nonwriting students, although not significantly. These results indicate that the process of writing helps students develop improved analytical and inference skills. Prior research indicates that the writing to learn strategy is effective because students must conceptually organize and structure their thoughts as well as their awareness of thinking processes ( Langer and Applebee, 1987 ; Ackerman, 1993 ; Holliday, 1994 ; Rivard, 1994 ). More specifically, as students begin to shape their thoughts at the point of construction and continually analyze, review, and clarify meaning through the processes of drafting and revision, they necessarily engage and apply analysis and inference skills ( Klein, 1999 ; Hand and Prain, 2002 ). In this study, the process of writing appears to have influenced critical thinking gains. It also seems likely that writing students experienced a greater cognitive demand than nonwriting students simply because the writing act required them to hypothesize, debate, and persuade ( Rivard, 1994 ; Hand and Prain, 2002 ) rather than memorize as was the case in nonwriting control courses.

Conversely, the lack of any significant change in analysis, inference, or evaluation skills in the nonwriting group indicated that the traditional lab instruction used in the general education biology control courses did not help students develop critical thinking skills. Based on the results of this study, it could be argued that traditional lab instruction actually prevents the development of critical thinking skills, which presents a rather large problem when one considers how frequently these traditional methods are used in general education biology courses. One also has to consider that the critical thinking gains seen in the writing group might also have resulted from the relative absence of traditional lab instruction rather than writing alone. Additional research will be necessary to gain further insight into this question. Either way, changes to the traditional model of lab instruction will be necessary if the goal is to enhance the critical thinking abilities of general education biology students.

The variable that had the largest impact on critical thinking performance gains was prior critical thinking skill. This phenomenon was previously observed by Quitadamo, Brahler, and Crouch (unpublished results) in a related study that investigated the effect of Peer Led Team Learning on critical thinking performance. That study focused on science and math major undergraduate critical thinking performance at a major research university, and found that, in addition to Peer Led Team Learning, prior critical thinking skill significantly influenced critical thinking performance (Quitadamo, Brahler, and Crouch, unpublished results). Specifically, students with the highest prior critical thinking skill showed the largest performance gains, whereas students with low initial skill were at a comparative disadvantage. The fact that prior critical thinking skill also had a large effect on critical thinking performance in this study increases the generalizability of the observation and underscores its importance. Simply put, students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills, not because they lack the cognitive hard-wiring to perform but because they lack the tools to build their knowledge. Is it reasonable or just to expect otherwise comparable students to perform at similar levels when only some of them have the keys for success? If we hope to improve the perception of science in this country, we need to educate people on how to think about important scientific issues, and not simply argue a position based on one school of thought. By helping general education students to develop critical thinking skills, it is hoped that they will be better able to think rationally about science.

The observation that students who come to general education biology with greater critical thinking skills leave with the largest skill gains has important implications for the K–12 school system as well. If a high proportion of students are coming to institutions of higher education lacking critical thinking skills, why are these skills not being explicitly taught in the K–12 system? Ideally, students would learn the foundational tenets of critical thinking at an earlier age, and be able to refine and hone these skills as they progress through the K–20 education system. The results of this study reinforce the idea that students should be explicitly taught critical thinking skills and be expected to practice them as early and often as possible.

Although its effect was smaller than writing or prior critical thinking skill, the instructor variable also played a significant role in student critical thinking performance, accounting for 2.5% of the total variance in critical thinking gains. Determining the particular qualities of each instructor that contributed to student critical thinking success and further separating instructor and writing effects will require additional research. Previous research indicates that teaching style positively influences certain aspects of student learning ( Grasha, 1994 ; Hativa et al., 2001 ; Bain, 2004 ), but the qualities that specifically influence student critical thinking gains have not been sufficiently investigated. Additional research in this area is necessary.

Faculty considering whether to use writing in the laboratory may wonder how much time and energy it takes to implement, whether efforts to change will translate into improved student learning, and how these changes affect disciplinary content. From a practical perspective, implementing writing did not take more time and effort per se; rather, it required faculty to reconceptualize how they spent their instructional time. Instead of individually developing course materials, writing faculty collaborated to a greater extent than nonwriting faculty on course design and on assessments that required students to demonstrate their critical thinking skill. Interviews of faculty from the writing and nonwriting groups indicated that writing faculty felt the course was less work because they collaborated with colleagues and because students demonstrated improved thinking skill. Writing faculty generally became more comfortable with the new model after ∼2–3 wk, when students began to show observable changes in writing proficiency and critical thinking. Together, collaboration with colleagues and observed gains in critical thinking tended to create a positive feedback loop that helped to sustain writing faculty efforts. In contrast, nonwriting faculty similarly wanted their students to think better but were convinced that traditional methods would be more effective, and so remained closed to change. There were some logistical challenges with writing, such as scheduling computer labs where students could draft and revise their weekly essay responses under instructor and teaching assistant supervision. Teaching assistants (and faculty) also needed to be trained to evaluate writing using a rubric. Finally, with regard to content coverage, no lecture or laboratory content was cut to implement writing, because writing and nonwriting students both performed the same lab activities.
Collectively, the benefits of using writing in laboratory should encourage faculty who want their students to learn to think critically to give it a try.

Future Directions

This study showed that writing affects student critical thinking skill in a nonmajors biology course, but the results have generated more questions than answers. How does writing specifically produce gains in critical thinking performance? What factors influence students' prior critical thinking skill? How do instructors specifically influence student gains in critical thinking? Future studies that analyze student essays in more detail would provide greater insight into how writing influences critical thinking skill. Writing could also be implemented in other nonmajor science courses such as chemistry, geology, or physics to determine the transferability of this method. Additional studies that investigate student prior critical thinking skill and instructor variables are also necessary. These future studies would further contribute to the knowledge base in this area and address some of its identified limitations ( Ebert-May et al., 1997 ; Daempfle, 2002 ). Results from these studies would also increase the generalizability of the results reported here.

CONCLUSIONS

Building on existing research and on the basis of several lines of evidence presented in this study, we conclude that writing positively influences critical thinking performance for general education biology students. Students with prior critical thinking skill may have a comparative advantage over other general education biology students who have not developed these same skills. To rectify that inequity, critical thinking skills should be explicitly taught early and used often throughout the K–20 academic process. Because particular instructors appear to improve student critical thinking skills more than others, students should be discerning in their choice of instructors if they want to improve their critical thinking skills. Whether writing as a method to improve critical thinking skills will prove useful in other general education science courses will likely depend on a host of factors, but it has potential. Further study of writing in general education science will be necessary to verify these results and discover the breadth and depth of how writing affects critical thinking skill.

ACKNOWLEDGMENTS

We thank Drs. Holly Pinkart, Roberta Soltz, Phil Mattocks, and James Johnson and undergraduate researchers Matthew Brewer, Dayrk Flaugh, Adam Wallace, Colette Watson, Kelly Vincent, and Christine Weller for their valuable contributions to this study. The authors also acknowledge the generous financial support provided by the Central Washington University Office of the Provost and the Office of the Associate Vice President for Undergraduate Studies.

  • Ackerman J. M. (1993). The promise of writing to learn. Writ. Commun. 10(3), 334-370.
  • American Association for the Advancement of Science (1989). Science for All Americans: A Project 2061 Report on Literacy Goals in Science, Mathematics, and Technology. Washington, DC.
  • Applebee A. N. (1984). Writing and reasoning. Rev. Educ. Res. 54(4), 577-596.
  • Association of American Colleges and Universities (2005). Liberal Education Outcomes: A Preliminary Report on Student Achievement in College. Washington, DC.
  • Bain K. (2004). What the Best College Teachers Do. Cambridge, MA: Harvard University Press.
  • Beers T., McIssac C., Henderson B., Gainen J. (1994). Writing: thesis and support scoring guide. Accessed 25 August 2006. http://www.insightassessment.com/pdf_files/RUB_WTHS.PDF
  • Bruffee K. A. (1984). Collaborative learning and the "conversation of mankind." Coll. Engl. 46(7), 635-653.
  • Business-Higher Education Forum and American Council on Education (2003). Building a Nation of Learners: The Need for Changes in Teaching and Learning To Meet Global Challenges. Washington, DC.
  • Bybee R. W., Fuchs B. (2006). Preparing the 21st century workforce: a new reform in science and technology education. J. Res. Sci. Teach. 43(4), 349-352.
  • Campbell D. T., Stanley J. C. (1963). Experimental and Quasi-experimental Designs for Research. Boston, MA: Houghton Mifflin.
  • Carnevale A. P. (1990). Workplace Basics: The Essential Skills Employers Want. American Society for Training and Development. San Francisco, CA: Jossey-Bass.
  • Center for Critical Thinking and Moral Critique (1996). ICAT Critical Thinking Essay Test. Rohnert Park, CA: Sonoma State University.
  • Champagne A., Kouba V., Mintzes J., Wandersee J., Novak J. (1999). Written product as performance measures. In: Assessing Science Understanding: A Human Constructivist View, New York: Academic Press, 224-248.
  • Collier K. G. (1980). Peer-group learning in higher education: the development of higher order skills. Stud. High. Educ. 5(1), 55-61.
  • Daempfle P. A. (2002). Instructional Approaches for the Improvement of Reasoning in Introductory College Biology Courses: A Review of the Research. New York: U.S. Department of Education.
  • Ebert-May D., Brewer C., Allred S. (1997). Innovation in large lectures—teaching for active learning. Bioscience 47(9), 601-607.
  • Ennis R. H. (1985). A logical basis for measuring critical thinking skills. Educ. Leadership 43(2), 44-48.
  • Ennis R. H., Weir E. (1985). The Ennis-Weir Critical Thinking Essay Test. Pacific Grove, CA: Midwest Publications.
  • Facione P. A. (1990a). The California Critical Thinking Skills Test—College Level. Technical Report 1: Experimental Validation and Content Validity. Millbrae, CA: Insight Assessment.
  • Facione P. A. (1990b). The California Critical Thinking Skills Test—College Level. Technical Report 3: Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST. Millbrae, CA: Insight Assessment.
  • Facione P. A. (1990c). The California Critical Thinking Skills Test—College Level. Technical Report 4: Interpreting the CCTST, Group Norms, and Sub-Scores. Millbrae, CA: Insight Assessment.
  • Facione P. A. (1991). Using the California Critical Thinking Skills Test in Research, Evaluation, and Assessment. Millbrae, CA: Insight Assessment.
  • Facione P. A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Research Findings and Recommendations. American Philosophical Association. Millbrae, CA: Insight Assessment.
  • Facione P. A., Facione N. C., Giancarlo C. A. (1992). Test Manual: The California Critical Thinking Disposition Inventory. Millbrae, CA: Insight Assessment.
  • Facione P. A., Facione N. C. (2004). Test of Everyday Reasoning. Insight Assessment. Accessed 30 June 2006. http://www.insightassessment.com/test-ter.html
  • Giancarlo C. A., Facione P. A. (2001). A look across four years at the disposition toward critical thinking among undergraduate students. J. Gen. Educ. 50(1), 29-55.
  • Grasha A. F. (1994). A matter of style: the teacher as expert, formal authority, personal model, facilitator, and delegator. Coll. Teach. 42(4), 142-149.
  • Hand B., Prain V. (2002). Teachers implementing writing-to-learn strategies in junior secondary science: a case study. Sci. Educ. 86(6), 737-755.
  • Hativa N., Barak R., Simhi E. (2001). Exemplary university teachers: knowledge and beliefs regarding effective teaching dimensions and strategies. J. High. Educ. 72(6), 699-729.
  • Holliday W. G. (1994). The reading-science learning-writing connection: breakthroughs, barriers, and promises. J. Res. Sci. Teach. 31(9), 877-893.
  • Holmes J., Clizbe E. (1997). Facing the 21st century. Bus. Educ. Forum 52(1), 33-35.
  • Holyoak A. R. (1998). A plan for writing throughout (not just across) the biology curriculum. Am. Biol. Teach. 60(3), 186-190.
  • Howard V. A. (1990). Thinking on paper: a philosopher's look at writing. In: V. A. Howard (ed.), Varieties of Thinking: Essays from Harvard's Philosophy of Education Research Center, New York: Routledge, 84-92.
  • Insight Assessment (2005). Blohm S. Annual number of users for the CCTST form 2000. Accessed 8 December 2006.
  • Jones E. A., Hoffman S., Moore L. M., Ratcliff G., Tibbets S., Click B. (1995). National Assessment of College Student Learning: Identifying College Graduates' Essential Skills in Writing, Speech and Listening, and Critical Thinking. Final project report, no. NCES-95-001. University Park, PA: U.S. Department of Education, Office of Educational Research and Improvement.
  • Jones G. M., Carter G. (1998). Small groups and shared constructions. In: J. J. Mintzes, J. H. Wandersee, J. D. Novak (eds.), Teaching Science for Understanding: A Human Constructivist View, San Diego, CA: Academic Press, 261-279.
  • Kelly G. J., Chen C. (1999). The sound of music: constructing science as sociocultural practices through oral and written discourse. J. Res. Sci. Teach. 36(8), 883-915.
  • Keys C. W. (1999). Revitalizing instruction in scientific genres: connecting knowledge production with writing to learn in science. Sci. Educ. 83(2), 115-130.
  • Keys C. W., Hand B., Prain V., Collins S. (1999). Using the science writing heuristic as a tool for learning from laboratory investigations in secondary science. J. Res. Sci. Teach. 36(10), 1065-1084.
  • Klein P. (1999). Reopening inquiry into cognitive processes in writing-to-learn. Ed. Psychol. Rev. 11(3), 203-270.
  • Klein P. D. (2004). Constructing scientific explanations through writing. Instr. Sci. 32(3), 191-231.
  • Klymkowsky M. W. (2006). Can nonmajors courses lead to biological literacy? Do majors courses do any better? Cell. Biol. Educ. 4, 42-44.
  • Kokkala I., Gessell D. A. (2003). Writing science effectively: biology and English students in an author-editor relationship. J. Coll. Sci. Teach. 32(4), 252-257.
  • Kurfiss J. G. (1988). Critical Thinking: Theory, Research, Practice, and Possibilities. Association for the Study of Higher Education. Washington, DC: George Washington University.
  • Langer J. A., Applebee A. N. (1985). Learning to write: learning to think. Educ. Horizons 64(1), 36-38.
  • Langer J. A., Applebee A. N. (1987). How Writing Shapes Thinking: A Study of Teaching and Learning. NCTE research report no. 22. Urbana, IL: National Council of Teachers of English.
  • Lawson A. E. (2001). Using the learning cycle to teach biology concepts and reasoning patterns. J. Biol. Educ. 35(4), 165-169.
  • Malcom S. M., Abdallah J., Chubin D. E., Grogan K. (2005). A System of Solutions: Every School, Every Student. Washington, DC: American Association for the Advancement of Science.
  • Marzano R. J. (1991). Fostering thinking across the curriculum through knowledge restructuring. J. Reading 34(7), 518-525.
  • National Academy of Sciences, National Academy of Engineering, Institute of Medicine (2005). Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future. Washington, DC: Committee on Prospering in the Global Economy of the 21st Century.
  • National Research Council (1995). National Science Education Standards. Washington, DC: National Academy Press.
  • National Research Council (2003). Bio 2010: Transforming Undergraduate Education for Future Research Biologists. Washington, DC: Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century.
  • National Science Foundation (1996). Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology. Washington, DC: Directorate for Education and Human Resources.
  • Office of Educational Research and Improvement (1991). Striving for Excellence: The National Education Goals. Washington, DC.
  • Project Kaleidoscope (2006). Transforming America's Scientific and Technological Infrastructure: Recommendations for Urgent Action. Washington, DC: National Science Foundation.
  • Resnick L. B. (1987). Education and Learning To Think. Washington, DC: National Academy Press.
  • Rivard L. P. (1994). A review of writing to learn in science: implications for practice and research. J. Res. Sci. Teach. 31(9), 969-983.
  • Springer L., Donovan S. S., Stanne M. E. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: a meta-analysis. Rev. Educ. Res. 69(1), 21-51.
  • Steglich C. S. (2000). A writing assignment that changes attitudes in biology classes. Am. Biol. Teach. 62(2), 98-101.
  • Taylor K. L., Sobota S. J. (1998). Writing in biology: an integration of disciplines. Am. Biol. Teach. 60(5), 350-353.
  • Tessier J. (2006). Writing assignment in a nonmajor introductory ecology class. J. Coll. Sci. Teach. 35(4), 25-29.
  • Tobin K. G., Tippins D. J., Gallard A. J. (1994). Research on instructional strategies for teaching science. In: D. L. Gabel (ed.), Handbook of Research on Science Teaching and Learning, New York: Macmillan, 45-93.
  • Tsui L. (1998). A review of research on critical thinking. ASHE annual meeting paper, Miami, FL, 5-8 November 1998.
  • Tsui L. (2002). Fostering critical thinking through effective pedagogy: evidence from four institutional case studies. J. High. Educ. 73(6), 740-763.
  • U.S. Department of Education (1990). National Goals for Education. Washington, DC.
  • Watson G., Glaser E. M. (1980). Watson-Glaser Critical Thinking Appraisal. Cleveland, OH: The Psychological Corporation (Harcourt Brace Jovanovich).
  • Critical thinking skills and science achievement: A latent profile analysis 1 Sep 2023 | Thinking Skills and Creativity, Vol. 49
  • Students Need More than Content Knowledge To Counter Vaccine Hesitancy 22 Aug 2023 | Journal of Microbiology & Biology Education, Vol. 24, No. 2
  • Comparing study features is easy but identifying next steps is hard: Evaluating critical thinking through the Biology Lab Inventory of Critical Thinking in Ecology 10 May 2023 | Ecology and Evolution, Vol. 13, No. 5
  • Teaching Psychopharmacology for Undergraduates 17 December 2022
  • EFL Students’ Perceptions and Challenges of Computer-Mediated Collaborative Writing in Academic Writing Courses at Van Lang University 7 April 2023
  • Julia Gouvea ,
  • Lara Appleby ,
  • Liren Fu , and
  • Vicente Talanquer, Monitoring Editor
  • Effectiveness of a manuscript writing workshop on writing confidence amongst nursing and health disciplines clinicians 1 Dec 2022 | Journal of Medical Imaging and Radiation Sciences, Vol. 53, No. 4
  • Evaluation of Creative Writing Products of Primary School Students According to Some Variables 26 September 2022 | Pamukkale University Journal of Education
  • Writing goals in U.S. undergraduate data science course outlines: A textual analysis 1 August 2022 | Teaching Statistics, Vol. 44, No. 3
  • Writing Is Thinking: Implementation and Evaluation of an Internal Medicine Residency Clinical Reasoning and Documentation Curriculum 13 June 2022 | Medical Science Educator, Vol. 32, No. 4
  • Promoting Computational Thinking Skills in Non-Computer-Science Students: Gamifying Computational Notebooks to Increase Student Engagement 1 Jun 2022 | IEEE Transactions on Learning Technologies, Vol. 15, No. 3
  • Effects of group awareness tools on students' engagement, performance, and perceptions in online collaborative writing: Intergroup information matters 1 Apr 2022 | The Internet and Higher Education, Vol. 53
  • Exploring the structure of students’ scientific higher order thinking in science education 1 Mar 2022 | Thinking Skills and Creativity, Vol. 43
  • Students’ perceptions of mathematics writing and its impact on their enjoyment and self-confidence 11 June 2021 | Teaching Mathematics and its Applications: An International Journal of the IMA, Vol. 41, No. 1
  • Teaching Psychopharmacology for Undergraduates 14 December 2021
  • Dennis Lee ,
  • Mallory Wright ,
  • Courtney Faber ,
  • Cazembe Kennedy , and
  • Dylan Dittrich-Reed
  • Stanley M. Lo, Monitoring Editor
  • Establishing a baseline of science communication skills in an undergraduate environmental science course 23 July 2021 | International Journal of STEM Education, Vol. 8, No. 1
  • Diedra M. Wrighting ,
  • Jamie Dombach ,
  • Mike Walker ,
  • Jenene Cook ,
  • Marlina Duncan ,
  • Gisselle Velez Ruiz ,
  • Adán Colón-Carmona , and
  • Bruce Birren
  • James Hewlett, Monitoring Editor
  • Analysis of university student responses to the pandemic in a formal microbiology assessment 7 July 2021 | FEMS Microbiology Letters, Vol. 368, No. 14
  • A challenge in teaching scientific communication: academic experience does not improve undergraduates’ ability to assess their or their peers’ writing 7 September 2020 | Assessment & Evaluation in Higher Education, Vol. 46, No. 5
  • The effect of practices based on argumentation-based inquiry approach on teacher candidates’ critical thinking 25 August 2019 | Educational Studies, Vol. 47, No. 1
  • Biosciences Proposal Bootcamp: Structured peer and faculty feedback improves trainees’ proposals and grantsmanship self-efficacy 28 December 2020 | PLOS ONE, Vol. 15, No. 12
  • Mini-Review - Teaching Writing in the Undergraduate Neuroscience Curriculum: Its Importance and Best Practices 1 Oct 2020 | Neuroscience Letters, Vol. 737
  • A Search for a Method to Improve Critical Thinking Skills in Social Studies Teaching: Writing-to-Learn 31 August 2020 | Review of International Geographical Education Online
  • Using Poetry in the Undergraduate Biology Classroom 1 Aug 2020 | The American Biology Teacher, Vol. 82, No. 6
  • Assessing Pharmacy Students’ Scientific Reasoning After Completing a Physics Course Taught Using Active-Learning Methods 1 Aug 2020 | American Journal of Pharmaceutical Education, Vol. 84, No. 8
  • Kritikal at inferensyal na kakayahan sa pagbasa ng mga mag-aaral sa larangan ng Filipino ng Visayas State University 14 July 2020 | International Journal of Research Studies in Education, Vol. 9, No. 3
  • Concept map and problem-based learning 1 Jun 2020 | Journal of Physics: Conference Series, Vol. 1567, No. 2
  • What I know Best: Assessing Indonesian Student’s Structural Knowledge through Student-Generated Topics 15 April 2020 | European Journal of Educational Research, Vol. volume-9-2020, No. volume-9-issue-2-april-2020
  • Use of a competency framework to explore the benefits of student-generated multiple-choice questions (MCQs) on student engagement 18 November 2019 | Pedagogies: An International Journal, Vol. 15, No. 2
  • Peculiarities of using of the Moodle test tools in philosophy teaching 20 March 2020 | CTE Workshop Proceedings, Vol. 7
  • Challenging endocrinology students with a critical-thinking workbook 1 Mar 2020 | Advances in Physiology Education, Vol. 44, No. 1
  • Utility value interventions in a college biology lab: The impact on motivation 21 August 2019 | Journal of Research in Science Teaching, Vol. 57, No. 2
  • Teaching Psychopharmacology for Undergraduates 1 June 2020
  • Writing Assignments in Epidemiology Courses: How Many and How Good? 21 May 2019 | Public Health Reports, Vol. 134, No. 4
  • Jason E. Dowd ,
  • Robert J. Thompson ,
  • Leslie Schiff ,
  • Kelaine Haas ,
  • Christine Hohmann ,
  • Chris Roy ,
  • Warren Meck ,
  • John Bruno , and
  • Julie A. Reynolds
  • Rebecca Price, Monitoring Editor
  • Impact of Peer-Led Team Learning and the Science Writing and Workshop Template on the Critical Thinking Skills of First-Year Chemistry Students 28 March 2019 | Journal of Chemical Education, Vol. 96, No. 5
  • The Power of Practice: Adjusting Curriculum to Include Emphasis on Skills 1 September 2023 | Journal of College Science Teaching, Vol. 48, No. 5
  • Exploring the effects of emotional design and emotional intelligence in multimedia-based learning: an engineering educational perspective 26 March 2019 | New Review of Hypermedia and Multimedia, Vol. 25, No. 1-2
  • Boosting Engagement and Intercultural Competence Through Technology 1 Jan 2019
  • Melanie L. Styers ,
  • Peter A. Van Zandt ,, and
  • Katherine L. Hayden
  • Erin L. Dolan, Monitoring Editor
  • Ana Maria Barral ,
  • Veronica C. Ardi-Pastores , and
  • Rachel E. Simmons
  • Jennifer Knight, Monitoring Editor
  • Using writing assignments to promote critical thinking, learning and professional identity: The Epidemiology Workplace Writing Repository 30 January 2018 | Journal of Public Health, Vol. 40, No. 3
  • Writing in the STEM classroom: Faculty conceptions of writing and its role in the undergraduate classroom 7 June 2018 | Science Education, Vol. 102, No. 5
  • Leslie A. Schiff , and
  • John Coley, Monitoring Editor
  • Learning to Write Like a Scientist: A Writing-Intensive Course for Microbiology/Health Science Students 1 Mar 2018 | Journal of Microbiology & Biology Education, Vol. 19, No. 1
  • Stepwise Approach To Writing Journal-Style Lab Reports in the Organic Chemistry Course Sequence 20 November 2017 | Journal of Chemical Education, Vol. 95, No. 1
  • Using Scaffolding and Deliberate Practice to Improve Abstract Writing in an Introductory Biology Laboratory Course 1 Jan 2018 | Journal of Microbiology & Biology Education, Vol. 19, No. 2
  • A Systematic Approach to Teaching Case Studies and Solving Novel Problems 1 Jan 2018 | Journal of Microbiology & Biology Education, Vol. 19, No. 3
  • WEB based technical problem solving for enhancing writing skills of secondary vocational students 22 July 2016 | Education and Information Technologies, Vol. 22, No. 4
  • Childhood predictors of written expression in late adolescents with 22q11.2 deletion syndrome: a longitudinal study 8 March 2017 | Journal of Intellectual Disability Research, Vol. 61, No. 5
  • Effects of shoe heel height on the end-point and joint kinematics of the locomotor system when crossing obstacles of different heights 6 May 2016 | Ergonomics, Vol. 60, No. 3
  • The use of writing assignments to help students synthesize content in upper-level undergraduate biology courses 1 February 2017 | FEMS Microbiology Letters, Vol. 364, No. 3
  • K. F. Weaver ,
  • V. Morales ,
  • M. Nelson ,
  • P. F. Weaver ,
  • A. Toledo , and
  • Kathryn E. Perez, Monitoring Editor
  • Gili Marbach-Ad ,
  • Carly Rietschel , and
  • Katerina V. Thompson
  • A programme-wide training framework to facilitate scientific communication skills development amongst biological sciences Masters students 19 February 2015 | Journal of Further and Higher Education, Vol. 40, No. 4
  • An Assessment of Student Perceptions and Responses to Frequent Low-stakes Testing in Introductory Sociology Classes 26 January 2016 | Teaching Sociology, Vol. 44, No. 2
  • Incorporation of an Explicit Critical-Thinking Curriculum to Improve Pharmacy Students’ Critical-Thinking Skills 1 Apr 2016 | American Journal of Pharmaceutical Education, Vol. 80, No. 3
  • Developing critical thinking skills using the Science Writing Heuristic in the chemistry laboratory 1 January 2016 | Chemistry Education Research and Practice, Vol. 17, No. 1
  • The effect of inference-making, deduction, and self-monitoring on EFL learners' language achievement, reading, and writing ability 18 November 2015 | International Journal of Research Studies in Language Learning, Vol. 5, No. 4
  • Community-Based Inquiry in Allied Health Biochemistry Promotes Equity by Improving Critical Thinking for Women and Showing Promise for Increasing Content Gains for Ethnic Minority Students 27 January 2015 | Journal of Chemical Education, Vol. 92, No. 5
  • How do learners experience joint writing: University students' conceptions of online collaborative writing tasks and environments 1 Mar 2015 | Computers & Education, Vol. 82
  • Critical Thinking and Character 1 Jan 2015
  • Oborové didaktiky: vývoj – stav – perspektivy 1 Jan 2015
  • A Modern Twist on the Beaumont and St. Martin Case: Encouraging Analysis and Discussion in the Bioethics Classroom with Reflective Writing and Concept Mapping 1 Dec 2014 | Journal of Microbiology & Biology Education, Vol. 15, No. 2
  • Michelle Mynlieff ,
  • Anita L. Manogaran ,
  • Martin St. Maurice , and
  • Thomas J. Eddinger
  • Mary Pat Wenderoth, Monitoring Editor
  • Evaluating Biological Claims to Enhance Critical Thinking through Position Statements 1 May 2014 | Journal of Microbiology & Biology Education, Vol. 15, No. 1
  • Jessica Middlemis Maher ,
  • Jonathan C. Markey , and
  • Diane Ebert-May
  • Career-Oriented Performance Tasks in Chemistry: Effects on Students' Critical Thinking Skills 1 Jan 2013 | Education Research International, Vol. 2013
  • The Critical Thinking Analytic Rubric (CTAR): Investigating intra-rater and inter-rater reliability of a scoring mechanism for critical thinking performance assessments 1 Oct 2012 | Assessing Writing, Vol. 17, No. 4
  • Ethics, Politics and Protests: Using Contentious Issues in Reproductive Sciences as Educational Opportunities 25 July 2012 | Reproduction in Domestic Animals, Vol. 47, No. s4
  • Julie Libarkin , and
  • Gabriel Ording
  • Vivian Siegel, Monitoring Editor
  • Meaning Check Questions as an Active Learning Tool in the University Biology Laboratory Assists International Students 1 Jan 2012 | Creative Education, Vol. 03, No. 06
  • The value of writing-to-learn when using question prompts to support web-based learning in ill-structured domains 16 September 2010 | Educational Technology Research and Development, Vol. 59, No. 1
  • Learning surface anatomy: Which learning approach is effective in an integrated PBL curriculum? 23 December 2010 | Medical Teacher, Vol. 33, No. 1
  • Learning with Online Activities 1 Jan 2011
  • An analysis of narratives to identify critical thinking contexts in psychiatric clinical practice 27 January 2010 | International Journal of Nursing Practice, Vol. 16, No. 1
  • Spacing and Testing Effects 1 Jan 2010
  • Interactions Between Students and Tutor in Problem‐Based Learning: The Significance of Deep Learning 25 June 2009 | The Kaohsiung Journal of Medical Sciences, Vol. 25, No. 5
  • Prompting students’ context-generating cognitive activity in ill-structured domains: does the prompting mode affect learning? 8 November 2008 | Educational Technology Research and Development, Vol. 57, No. 2
  • The Impact of Prompting in Technology-Enhanced Learning as Moderated by Students’ Motivation and Metacognitive Skills 1 Jan 2009
  • Ian J. Quitadamo ,
  • Celia L. Faiola ,
  • James E. Johnson , and
  • Deborah Allen, Monitoring Editor
  • Use of Portfolios by Medical Students: Significance of Critical Thinking 20 October 2008 | The Kaohsiung Journal of Medical Sciences, Vol. 24, No. 7
  • Robert Yuan , and
  • Eric Chudler, Monitoring Editor


CTL Guide to the Critical Thinking Hub Area

Guidance for designing or teaching a Critical Thinking (CRT) course, including assignment resources and examples.

From the BU Hub Curriculum Guide

“The ability to think critically is the fundamental characteristic of an educated person. It is required for just, civil society and governance, prized by employers, and essential for the growth of wisdom. Critical thinking is what most people name first when asked about the essential components of a college education. From identifying and questioning assumptions, to weighing evidence before accepting an opinion or drawing a conclusion—all BU students will actively learn the habits of mind that characterize critical thinking, develop the self-discipline it requires, and practice it often, in varied contexts, across their education.” For more context around this Hub area, see this  Hub page .

Learning Outcomes

Courses and cocurricular activities in this area must include all of the following outcomes.

  • Students will both gain critical thinking skills and be able to specify the components of critical thinking appropriate to a discipline or family of disciplines. These may include habits of distinguishing deductive from inductive modes of inference, methods of adjudicating disputes, recognizing common logical fallacies and cognitive biases, translating ordinary language into formal argument, distinguishing empirical claims about matters of fact from normative or evaluative judgments, and/or recognizing the ways in which emotional responses or cultural assumptions can affect reasoning processes.
  • Drawing on skills developed in class, students will be able to critically evaluate, analyze, and generate arguments, bodies of evidence, and/or claims, including their own.

If you are proposing a CRT course or if you want to learn more about these outcomes, please see this Interpretive Document. Interpretive Documents, written by the General Education Committee, are designed to answer questions faculty have raised about Hub policies, practices, and learning outcomes as a part of the course approval process. To learn more about the proposal process, start here.

Area Specific Resources

  • Richard Paul, Center for Critical Thinking (criticalthinking.org). Includes sample lessons, syllabi, teaching suggestions, and interdisciplinary resources and examples.
  • John Bean’s Engaging Ideas: The Professor’s Guide to Integrating Writing, Critical Thinking, and Active Learning in the Classroom is an invaluable resource for developing classroom activities and assignments that promote critical thinking and the scaffolding of writing.

Assignment Ideas

Weekly writing assignments.

These assignments are question-driven, thematic, and require students to integrate disciplinary and critical thinking literature to evaluate the validity of arguments in case studies, as well as the connections among method, theory, and practice in the case studies. Here, students are asked to utilize a chosen critical thinking framework throughout their written responses. These assignments can evolve during the semester by prompting students to address increasing complex case studies and arguments while also evaluating their own opinions using evidence from the readings. Along the way, students have ample opportunities for self-reflection, peer feedback, and coaching by the instructor.

Argument Mapping

A visual technique that allows students to analyze persuasive prose. Argument mapping lets students evaluate arguments, that is, distinguish valid from invalid arguments and assess the soundness of each. Advanced usage can help students organize and navigate complex information, encourage clearly articulated reasoning, and promote quick and effective communication. To learn more, please explore the following resources:

  • Carnegie Mellon University’s Open Learning Initiative course on this topic provides an excellent introduction to exploring and understanding arguments. The course explains what the parts of an argument are, how to break arguments into their component parts, and how to create diagrams to show how those parts relate to each other.
  • Philmaps.com provides a handout that introduces the concept of argument mapping to students, and also includes a number of sample activities that faculty can use to introduce students to argument mapping.
  • MindMup’s Argument Visualization platform is an online mind-mapping tool that is easily leveraged for creating argument maps.
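The structure behind all of these tools is the same: an argument map is a tree in which each claim carries the premises offered in its support and the objections raised against it. As a minimal, hypothetical sketch (not taken from any of the tools above; all names are illustrative), the structure and an indented-outline rendering might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class Claim:
    """One node in an argument map: a claim plus the claims that bear on it."""
    text: str
    supports: list = field(default_factory=list)    # premises offered in support
    objections: list = field(default_factory=list)  # counter-claims against it

def render(claim, indent=0, marker=""):
    """Render the map as an indented outline; [+] marks support, [-] objections."""
    lines = [" " * indent + marker + claim.text]
    for premise in claim.supports:
        lines.extend(render(premise, indent + 4, "[+] "))
    for objection in claim.objections:
        lines.extend(render(objection, indent + 4, "[-] "))
    return lines

# A toy map for an illustrative claim.
conclusion = Claim(
    "The city should expand bike lanes",
    supports=[Claim("Bike lanes reduce traffic congestion",
                    supports=[Claim("Cities with lane networks report fewer car trips")])],
    objections=[Claim("Construction would disrupt local businesses")],
)
print("\n".join(render(conclusion)))
```

Having students draw the same shape on paper (conclusion on top, premises and objections branching below it) is the classroom version of this traversal; the point of the sketch is only that each premise in a map is itself a claim, so it can in turn be questioned, supported, and objected to.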

Research Proposal and Final Research Paper

Demonstrates students’ ability to identify, distinguish, and assess ideological and evaluative claims and judgments about the selected research topic. Throughout the semester, students have the opportunity to practice their ability to evaluate the validity of arguments, including their own beliefs about the topic. Formative and summative assessments are provided to students at regular intervals and during each stage of the project.

Facilitating Discussion that Presses Students for Accuracy and Expanded Reasoning. This resource is part of Harvard University’s Graduate School of Education “Instructional Moves” video series.

Additional sample assignments and assessments can be found throughout the Area Specific Resources section above.

Course Design Questions

As you are integrating critical thinking into your course, here are a few questions that you might consider:

  • What framework/vocabulary/process do you use to teach the key elements of critical thinking in your course?
  • What assigned readings or other materials do you use to teach critical thinking specifically?
  • Do students have opportunities throughout the semester to apply and practice these skills and receive feedback?
  • What graded assignments evaluate how well students can both identify the key elements of critical thinking and demonstrate their ability to evaluate the validity of arguments (including their own)?


Classroom Q&A

With Larry Ferlazzo

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to [email protected]. Read more from this blog.

Eight Instructional Strategies for Promoting Critical Thinking



(This is the first post in a three-part series.)

The new question-of-the-week is:

What is critical thinking and how can we integrate it into the classroom?

This three-part series will explore what critical thinking is, whether it can be specifically taught and, if so, how teachers can do so in their classrooms.

Today’s guests are Dara Laws Savage, Patrick Brown, Meg Riordan, Ph.D., and Dr. PJ Caposey. Dara, Patrick, and Meg were also guests on my 10-minute BAM! Radio Show. You can also find a list of, and links to, previous shows here.

You might also be interested in The Best Resources On Teaching & Learning Critical Thinking In The Classroom .

Current Events

Dara Laws Savage is an English teacher at the Early College High School at Delaware State University, where she serves as a teacher, instructional coach, and lead mentor. Dara has been teaching for 25 years (career preparation, English, photography, yearbook, newspaper, and graphic design) and has presented nationally on project-based learning and technology integration:

There is so much going on right now and there is an overload of information for us to process. Did you ever stop to think how our students are processing current events? They see news feeds, hear news reports, and scan photos and posts, but are they truly thinking about what they are hearing and seeing?

I tell my students that my job is not to give them answers but to teach them how to think about what they read and hear. So what is critical thinking and how can we integrate it into the classroom? There are just as many definitions of critical thinking as there are people trying to define it. However, the Critical Thinking Consortium focuses on the tools to create a thinking-based classroom rather than a definition: “Shape the climate to support thinking, create opportunities for thinking, build capacity to think, provide guidance to inform thinking.” Using these four criteria and pairing them with current events, teachers can easily create learning spaces that thrive on thinking and keep students engaged.

One successful technique I use is the FIRE Write. Students are given a quote, a paragraph, an excerpt, or a photo from the headlines. Students are asked to Focus and respond to the selection for three minutes. Next, students are asked to Identify a phrase or section of the photo and write for two minutes. Third, students are asked to Reframe their response around a specific word, phrase, or section within their previous selection. Finally, students Exchange their thoughts with a classmate. Within the exchange, students also talk about how the selection connects to what we are covering in class.

There was a controversial Pepsi ad in 2017 involving Kendall Jenner and a protest with a police presence. The imagery in the photo was strikingly similar to a photo that went viral of a young lady standing opposite a police line. Using that image from a current event engaged my students and gave them the opportunity to think critically about events of the time.

Here are the two photos and a student response:

F - Focus on both photos and respond for three minutes

In the first picture, you see a strong and courageous black female, bravely standing in front of two officers in protest. She is risking her life to do so. Iesha Evans is simply proving to the world she does NOT mean less because she is black … and yet officers are there to stop her. She did not step down. In the picture below, you see Kendall Jenner handing a police officer a Pepsi. Maybe this wouldn’t be a big deal, except this was Pepsi’s weak, pathetic, and outrageous excuse of a commercial that belittles the whole movement of people fighting for their lives.

I - Identify a word or phrase, underline it, then write about it for two minutes

A white, privileged female in place of a fighting black woman was asking for trouble. A struggle we are continuously fighting every day, and they make a mockery of it. “I know what will work! Here Mr. Police Officer! Drink some Pepsi!” As if. Pepsi made a fool of themselves, and now their already dwindling fan base continues to ever shrink smaller.

R - Reframe your thoughts by choosing a different word, then write about that for one minute

You don’t know privilege until it’s gone. You don’t know privilege while it’s there—but you can and will be made accountable and aware. Don’t use it for evil. You are not stupid. Use it to do something. Kendall could’ve NOT done the commercial. Kendall could’ve released another commercial standing behind a black woman. Anything!

Exchange - Remember to discuss how this connects to our school song project and our previous discussions.

This connects two ways - 1) We want to convey a strong message. Be powerful. Show who we are. And Pepsi definitely tried. … Which leads to the second connection. 2) Not mess up and offend anyone, as had the one alma mater had been linked to black minstrels. We want to be amazing, but we have to be smart and careful and make sure we include everyone who goes to our school and everyone who may go to our school.

As a final step, students read and annotate the full article and compare it to their initial response.

Using current events and critical-thinking strategies like FIRE writing helps create a learning space where thinking is the goal rather than a score on a multiple-choice assessment. Critical-thinking skills can cross over to any of students’ other courses and into life outside the classroom. After all, we as teachers want to help the whole student be successful, and critical thinking is an important part of navigating life after they leave our classrooms.


‘Before-Explore-Explain’

Patrick Brown is the executive director of STEM and CTE for the Fort Zumwalt school district in Missouri and an experienced educator and author:

Planning for critical thinking focuses on teaching the most crucial science concepts, practices, and logical-thinking skills as well as the best use of instructional time. One way to ensure that lessons maintain a focus on critical thinking is to focus on the instructional sequence used to teach.

Explore-before-explain teaching is all about promoting critical thinking for learners to better prepare students for the reality of their world. What having an explore-before-explain mindset means is that in our planning, we prioritize giving students firsthand experiences with data, allow students to construct evidence-based claims that focus on conceptual understanding, and challenge students to discuss and think about the why behind phenomena.

Just think of the critical thinking that has to occur for students to construct a scientific claim. 1) They need the opportunity to collect data, analyze it, and determine how to make sense of what the data may mean. 2) With data in hand, students can begin thinking about the validity and reliability of their experience and the information collected. 3) They can consider what differences, if any, they might find if they completed the investigation again. 4) They can scrutinize outlying data points to decide whether they are an artifact of a true difference that merits further exploration or of a misstep in the procedure, measuring device, or measurement. All of these intellectual activities help them form a more robust understanding and are evidence of their critical thinking.

In explore-before-explain teaching, all of these hard critical-thinking tasks come before teacher explanations of content. Whether we use discovery experiences, problem-based learning, or inquiry-based activities, strategies that are geared toward helping students construct understanding promote critical thinking because students learn content by doing the practices valued in the field to generate knowledge.


An Issue of Equity

Meg Riordan, Ph.D., is the chief learning officer at The Possible Project, an out-of-school program that collaborates with youth to build entrepreneurial skills and mindsets and provides pathways to careers and long-term economic prosperity. She has been in the field of education for over 25 years as a middle and high school teacher, school coach, college professor, regional director of N.Y.C. Outward Bound Schools, and director of external research with EL Education:

Although critical thinking often defies straightforward definition, most in the education field agree it consists of several components: reasoning, problem-solving, and decisionmaking, plus analysis and evaluation of information, such that multiple sides of an issue can be explored. It also includes dispositions and “the willingness to apply critical-thinking principles, rather than fall back on existing unexamined beliefs, or simply believe what you’re told by authority figures.”

Despite variation in definitions, critical thinking is nonetheless promoted as an essential outcome of students’ learning—we want to see students and adults demonstrate it across all fields, professions, and in their personal lives. Yet there is simultaneously a rationing of opportunities in schools for students of color, students from under-resourced communities, and other historically marginalized groups to deeply learn and practice critical thinking.

For example, many of our most underserved students often spend class time filling out worksheets, promoting high compliance but low engagement, inquiry, critical thinking, or creation of new ideas. At a time in our world when college and careers are critical for participation in society and the global, knowledge-based economy, far too many students struggle within classrooms and schools that reinforce low expectations and inequity.

If educators aim to prepare all students for an ever-evolving marketplace and develop skills that will be valued no matter what tomorrow’s jobs are, then we must move critical thinking to the forefront of classroom experiences. And educators must design learning to cultivate it.

So, what does that really look like?

Unpack and define critical thinking

To understand critical thinking, educators need to first unpack and define its components. What exactly are we looking for when we speak about reasoning or exploring multiple perspectives on an issue? How does problem-solving show up in English, math, science, art, or other disciplines—and how is it assessed? At Two Rivers, an EL Education school, the faculty identified five constructs of critical thinking, defined each, and created rubrics to generate a shared picture of quality for teachers and students. The rubrics were then adapted across grade levels to indicate students’ learning progressions.

At Avenues World School, critical thinking is one of the Avenues World Elements and is an enduring outcome embedded in students’ early experiences through 12th grade. For instance, a kindergarten student may be expected to “identify cause and effect in familiar contexts,” while an 8th grader should demonstrate the ability to “seek out sufficient evidence before accepting a claim as true,” “identify bias in claims and evidence,” and “reconsider strongly held points of view in light of new evidence.”

When faculty and students embrace a common vision of what critical thinking looks and sounds like and how it is assessed, educators can then explicitly design learning experiences that call for students to employ critical-thinking skills. This kind of work must occur across all schools and programs, especially those serving large numbers of students of color. As Linda Darling-Hammond asserts, “Schools that serve large numbers of students of color are least likely to offer the kind of curriculum needed to ... help students attain the [critical-thinking] skills needed in a knowledge work economy.”

So, what can it look like to create those kinds of learning experiences?

Designing experiences for critical thinking

After defining a shared understanding of “what” critical thinking is and “how” it shows up across multiple disciplines and grade levels, it is essential to create learning experiences that impel students to cultivate, practice, and apply these skills. There are several levers that offer pathways for teachers to promote critical thinking in lessons:

1. Choose Compelling Topics: Keep it relevant

A key Common Core State Standard asks students to “write arguments to support claims in an analysis of substantive topics or texts using valid reasoning and relevant and sufficient evidence.” That might not sound exciting or culturally relevant. But a learning experience designed for a 12th grade humanities class engaged learners in a compelling topic—policing in America—to analyze and evaluate multiple texts (including primary sources) and share the reasoning for their perspectives through discussion and writing. Students grappled with ideas and their beliefs and employed deep critical-thinking skills to develop arguments for their claims. Embedding critical-thinking skills in curriculum that students care about and connect with can ignite powerful learning experiences.

2. Make Local Connections: Keep it real

At The Possible Project, an out-of-school-time program designed to promote entrepreneurial skills and mindsets, students in a recent summer online program (modified from in-person due to COVID-19) explored the impact of COVID-19 on their communities and local BIPOC-owned businesses. They learned interviewing skills through a partnership with Everyday Boston, conducted virtual interviews with entrepreneurs, evaluated information from their interviews and local data, and examined their previously held beliefs. They created blog posts and videos to reflect on their learning and consider how their mindsets had changed as a result of the experience. In this way, we can design powerful community-based learning and invite students into productive struggle with multiple perspectives.

3. Create Authentic Projects: Keep it rigorous

At Big Picture Learning schools, students engage in internship-based learning experiences as a central part of their schooling. Their school-based adviser and internship-based mentor support them in developing real-world projects that promote deeper learning and critical-thinking skills. Such authentic experiences teach “young people to be thinkers, to be curious, to get from curiosity to creation … and it helps students design a learning experience that answers their questions, [providing an] opportunity to communicate it to a larger audience—a major indicator of postsecondary success.” Even in a remote environment, we can design projects that ask more of students than rote memorization and that spark critical thinking.

Our call to action is this: As educators, we need to make opportunities for critical thinking available not only to the affluent or those fortunate enough to be placed in advanced courses. The tools are available; let’s use them. Let’s interrogate our current curriculum and design learning experiences that engage all students in real, relevant, and rigorous experiences that require critical thinking and prepare them for promising postsecondary pathways.


Critical Thinking & Student Engagement

Dr. PJ Caposey is an award-winning educator, keynote speaker, consultant, and author of seven books who currently serves as the superintendent of schools for the award-winning Meridian CUSD 223 in northwest Illinois. You can find PJ on most social-media platforms as MCUSDSupe:

When I start my keynote on student engagement, I invite two people up on stage and give them each five paper balls to shoot at a garbage can also conveniently placed on stage. Contestant One shoots their shot, and the audience gives approval. Four out of 5 is a heckuva score. Then just before Contestant Two shoots, I blindfold them and start moving the garbage can back and forth. I usually try to ensure that they can at least make one of their shots. Nobody is successful in this unfair environment.

I thank them and send them back to their seats and then explain that this little activity was akin to student engagement. While we all know we want student engagement, we are shooting at different targets. More importantly, for teachers, it is near impossible for them to hit a target that is moving and that they cannot see.

Within the world of education and particularly as educational leaders, we have failed to simplify what student engagement looks like, and it is impossible to define or articulate what student engagement looks like if we cannot clearly articulate what critical thinking is and looks like in a classroom. Because, simply, without critical thought, there is no engagement.

The good news here is that critical thought has been defined and placed into taxonomies for decades already. This is not something new and not something that needs to be redefined. I am a Bloom’s person, but there is nothing wrong with DOK or some of the other taxonomies, either. To be precise, I am a huge fan of Daggett’s Rigor and Relevance Framework. I have used that as a core element of my practice for years, and it has shaped who I am as an instructional leader.

So, in order to explain critical thought, a teacher or a leader must familiarize themselves with these tried-and-true taxonomies. Easy, right? Yes, sort of. The issue is not understanding what critical thought is; it is the ability to integrate it into the classroom. In order to do so, there are four key steps every educator must take.

PLANNING

  • Integrating critical thought/rigor into a lesson does not happen by chance; it happens by design. Planning for critical thought and engagement is much different from planning for a traditional lesson. In order to plan for kids to think critically, you have to provide a base of knowledge and excellent prompts to allow them to explore their own thinking in order to analyze, evaluate, or synthesize information.
  • SIDE NOTE – Bloom’s verbs are a great way to start when writing objectives, but true planning will take you deeper than this.

QUESTIONING

  • If the questions and prompts given in a classroom have correct answers or if the teacher ends up answering their own questions, the lesson will lack critical thought and rigor.
  • Script five questions forcing higher-order thought prior to every lesson. Experienced teachers may not feel they need this, but it helps to create an effective habit.
  • If lessons are rigorous and assessments are not, students will do well on their assessments, and that may not be an accurate representation of the knowledge and skills they have mastered. If lessons are easy and assessments are rigorous, the exact opposite will happen. When deciding to increase critical thought, it must happen in all three phases of the game: planning, instruction, and assessment.

TALK TIME / CONTROL

  • To increase rigor, the teacher must DO LESS. This feels counterintuitive but is accurate. Rigorous lessons involving tons of critical thought must allow students to work on their own, collaborate with peers, and connect their ideas. This cannot happen in a room that is silent except for the teacher talking. In order to increase rigor, decrease talk time and become comfortable with less control. Asking questions and giving prompts that lead to no true correct answer also means less control. This is a tough ask for some teachers. Explained differently, if you assign one assignment and get 30 very similar products, you have most likely assigned a low-rigor recipe. If you assign one assignment and get multiple varied products, then the students have had a chance to think deeply, and you have successfully integrated critical thought into your classroom.


Thanks to Dara, Patrick, Meg, and PJ for their contributions!

Please feel free to leave a comment with your reactions to the topic or directly to anything that has been said in this post.

Consider contributing a question to be answered in a future post. You can send one to me at [email protected] . When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer remaining anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo .

Education Week has published a collection of posts from this blog, along with new material, in an e-book form. It’s titled Classroom Management Q&As: Expert Strategies for Teaching.

Just a reminder: you can subscribe and receive updates from this blog via email (The RSS feed for this blog, and for all Ed Week articles, has been changed by the new redesign—new ones won’t be available until February). And if you missed any of the highlights from the first nine years of this blog, you can see a categorized list below.

  • This Year’s Most Popular Q&A Posts
  • Race & Racism in Schools
  • School Closures & the Coronavirus Crisis
  • Classroom-Management Advice
  • Best Ways to Begin the School Year
  • Best Ways to End the School Year
  • Student Motivation & Social-Emotional Learning
  • Implementing the Common Core
  • Facing Gender Challenges in Education
  • Teaching Social Studies
  • Cooperative & Collaborative Learning
  • Using Tech in the Classroom
  • Student Voices
  • Parent Engagement in Schools
  • Teaching English-Language Learners
  • Reading Instruction
  • Writing Instruction
  • Education Policy Issues
  • Differentiating Instruction
  • Math Instruction
  • Science Instruction
  • Advice for New Teachers
  • Author Interviews
  • Entering the Teaching Profession
  • The Inclusive Classroom
  • Learning & the Brain
  • Administrator Leadership
  • Teacher Leadership
  • Relationships in Schools
  • Professional Development
  • Instructional Strategies
  • Best of Classroom Q&A
  • Professional Collaboration
  • Classroom Organization
  • Mistakes in Education
  • Project-Based Learning

I am also creating a Twitter list including all contributors to this column.

The opinions expressed in Classroom Q&A With Larry Ferlazzo are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.


Essential Learning Outcomes: Critical/Creative Thinking

  • Civic Responsibility
  • Critical/Creative Thinking
  • Cultural Sensitivity
  • Information Literacy
  • Oral Communication
  • Quantitative Reasoning
  • Written Communication
  • Diversity, Equity & Inclusion

Description

Guide to Critical/Creative Thinking

Intended Learning Outcome:

Analyze, evaluate, and synthesize information in order to consider problems/ideas and transform them in innovative or imaginative ways (see below for definitions).

Assessment may include but is not limited to the following criteria and intended outcomes:

Analyze problems/ideas critically and/or creatively

  • Formulates appropriate questions to consider problems/issues
  • Evaluates costs and benefits of a solution
  • Identifies possible solutions to problems or resolution to issues
  • Applies innovative and imaginative approaches to problems/ideas

Synthesize information/ideas into a coherent whole

  • Seeks and compares information that leads to informed decisions/opinions
  • Applies fact and opinion appropriately
  • Expands upon ideas to foster new lines of inquiry
  • Synthesizes ideas into a coherent whole

Evaluate synthesized information in order to transform problems/ideas in innovative or imaginative ways

  • Applies synthesized information to inform effective decisions
  • Experiments with creating a novel idea, question, or product
  • Uses new approaches and takes appropriate risks without going beyond the guidelines of the assignment
  • Evaluates and reflects on the decision through a process that takes into account the complexities of an issue

From the Association of American Colleges & Universities, LEAP outcomes and VALUE rubrics: Critical thinking is a habit of mind characterized by the comprehensive exploration of issues, ideas, artifacts, and events before accepting or formulating an opinion or conclusion.

Creative thinking is both the capacity to combine or synthesize existing ideas, images, or expertise in original ways and the experience of thinking, reacting, and working in an imaginative way characterized by a high degree of innovation, divergent thinking, and risk taking.

Elements, excerpts, and ideas borrowed with permission from Assessing Outcomes and Improving Achievement: Tips and Tools for Using Rubrics, edited by Terrel L. Rhodes. Copyright 2010 by the Association of American Colleges and Universities.

How to Align - Critical/Creative Thinking

  • Critical/Creative Thinking ELO Tutorial

Critical/Creative Thinking Rubric

Analyze, evaluate, and synthesize information in order to consider problems/ideas and transform them in innovative or imaginative ways.

Each criterion is rated on a four-point scale: Inadequate, Developing, Competent, Proficient.

Analyze problems/ideas critically and/or creatively
  • Inadequate: Does not analyze problems/ideas
  • Developing: Analyzes problems/ideas but not critically and/or creatively
  • Competent: Begins to analyze the problems/ideas critically and/or creatively
  • Proficient: Analyzes the problems/ideas critically and/or creatively

Synthesize information/ideas into a coherent whole
  • Inadequate: Does not synthesize information/ideas
  • Developing: Begins to synthesize information/ideas but not into a coherent whole
  • Competent: Synthesizes information/ideas but not into a coherent whole
  • Proficient: Synthesizes information/ideas into a coherent whole

Evaluate synthesized information in order to transform problems/ideas in innovative or imaginative ways
  • Inadequate: Does not evaluate synthesized information in order to transform problems/ideas
  • Developing: Evaluates synthesized information and begins to transform problems/ideas
  • Competent: Evaluates synthesized information and transforms problems/ideas
  • Proficient: Evaluates synthesized information and transforms problems/ideas, accounting for their complexities or nuances


Sample Assignments

  • Cleveland Museum of Art tour (Just Mercy) Assignment contributed by Chris Wolken, Matt Lafferty, Luke Schuleter and Sara Clark.
  • Disaster Analysis This assignment was created by faculty at Durham College in Canada. The purpose of this assignment is to evaluate students’ ability to think critically about how natural disasters are portrayed in the media.
  • Laboratory Report-Critical Thinking Assignment contributed by Anne Distler.
  • (Re)Imaginings assignment ENG 1020 Assignment contributed by Sara Fuller.
  • Sustainability Project-Part 1 Waste Journal Assignment contributed by Anne Distler.
  • Sustainability Project-Part 2 Research Assignment contributed by Anne Distler.
  • Sustainability Project-Part 3 Waste Journal Continuation Assignment contributed by Anne Distler.
  • Sustainability Project-Part 4 Reflection Assignment contributed by Anne Distler.
  • Reconstructed Landscapes (VCPH) Assignment contributed by Jonathan Wayne
  • Book Cover Design (VCIL) Assignment contributed by George Kopec

  • Last Updated: Jan 8, 2024 12:20 PM
  • URL: https://libguides.tri-c.edu/Essential


Learning Outcomes

  • Emerson 5 C’s
  • Communication Sciences & Disorders
  • Communication Studies
  • Marketing Communication
  • Performing Arts
  • Visual & Media Arts
  • Writing, Literature, & Publishing
  • Honors Program
  • Minor Programs
  • Foundations
  • Perspectives
  • Graduate Programs
  • Fink's Taxonomy of Significant Learning
  • A Basic Guide to SLOs

Writing Effective SLOs

Developing strong SLOs involves clearly articulating what students will be able to demonstrate by the end of a course or program. Here are some key elements to consider:

  • Action Verbs: Use verbs that describe observable behaviors, such as analyze, create, interpret, synthesize, evaluate .
  • Focus on Learning: The focus should be on what students will learn , not what they will be taught (e.g., "Students will be able to..." rather than "The course will cover...").
  • Specificity: Clearly define the knowledge, skills, or dispositions students will develop.
  • Measurability: Consider how you will assess whether students have achieved the learning outcome.

Examples of Fink's Taxonomy in Action:

  • Foundational Knowledge (Remembering): Students will be able to recall the key tenets of [Theory X] and [Theory Y] in organizational psychology.
  • Application (Applying): Students will be able to design a research study that utilizes qualitative methods to investigate a social issue within their community.
  • Integration (Connecting): Students will be able to compare and contrast the ethical frameworks presented in different religious traditions.

By following these tips and leveraging Fink's Taxonomy, you can craft SLOs that are clear, measurable, and aligned with your course or program objectives. Let's work together to empower students on their academic journey and prepare them for lifelong success.

OAA's "Dos" and "Don'ts" for SLOs

Below are several quick tips to keep in mind when designing or revising your SLOs:

  • Focus on students: Learning outcomes should give students a clear picture of what they will be learning.
  • Start with the end in mind: Identify what students will know, understand, and be able to do at the end of the course or program.
  • Use simple language: Learning outcomes should be easy to understand for students and others who are not experts in the field.
  • Employ Fink's "Taxonomy of Significant Learning": When writing learning outcomes, consider Fink's six categories of significant learning (Foundational Knowledge, Application, Integration, Human Dimension, Caring, and Learning How to Learn), which promote a holistic, student-centered approach to learning.
  • Be clear and specific: Learning outcomes should be clear, specific, and measurable, using action verbs that describe the expected performance.
  • Limit each learning outcome to one measurable outcome (i.e. action verb): One measurable student behavior per learning outcome is more likely to produce reliable, potentially valid assessment of student learning.
  • Consider diverse learners: Learning outcomes should be inclusive and considerate of the diversity of students' backgrounds, experiences, abilities, and learning styles. 
  • Focus on you: Avoid learning outcomes that describe what or how you will teach and instead focus on what the students will be able to know, do, or demonstrate. 
  • Use jargon and vague language: Avoid using jargon and area-specific language so that learning outcomes are comprehensible to everyone. 
  • Include multiple verbs in a single learning outcome: Multiple verbs in a learning outcome create multiple problems: Which is the one that will be assessed? If more than one at the same time, will they be weighted differently in determining whether a student has attained the outcome?
  • Incorporate a product/deliverable into learning outcomes: Learning outcomes should describe the knowledge, skills, and abilities that students are expected to gain from a course or program of study. Including products/deliverables (i.e. oral presentations, capstone projects) can limit the scope of learning to only whether a specific project was completed.
  • Have too many learning outcomes: Best practice indicates having 5-7 learning outcomes. Limiting the number of outcomes improves the quality and effectiveness of the curriculum, enhances student engagement and understanding, and simplifies the assessment process.
  • Write learning outcomes that are too narrow: Avoid writing outcomes that are too narrow or focused on specific tasks, as this can limit the scope of learning. 
  • Write learning outcomes that are too broad: Learning outcomes that are too broad make it difficult to assess whether students have achieved the intended learning.

OAA is here to help!

The Office of Academic Assessment is happy to consult with you to design or revise your course's student learning outcomes. Please contact us if you'd like to chat about SLOs!

  • Last Updated: Jul 17, 2024 5:11 PM
  • URL: https://guides.library.emerson.edu/oaa/learningoutcomes

Office of Academic Assessment Emerson College 120 Boylston Street Boston MA 02116

Phone: Call us at (617) 824-8668. Email: Email us at [email protected].


Critical thinking

Critical writing

Students sometimes receive feedback such as "your essay is too descriptive" or "you need to show more critical analysis". While some description may be necessary – for instance if you are providing background information – most university assignments require you to produce work that is analytical and critical in its approach.

Your tutors want to know what you think

Your writing needs to show your interpretation of the evidence and source material, how you have used that information to demonstrate your understanding, and your subsequent position on the topic. Being critical in your writing means engaging in academic debates and research happening in your subject area.

The sources you select, the way you show how they agree or disagree with other pieces of evidence, and the way you structure your argument will all show your thought process and how you have understood the information you have read.

Use evidence to strengthen your position

Always keep your reader in mind and try to anticipate the questions they would ask — refer back to the Critical thinking questions (PDF) to help you with this. You can use evidence to help you strengthen your position, answer readers' questions, and "neutralise" opposing points of view.

Remember to keep descriptive statements to a minimum — there is no need to provide large amounts of background or historical information.

Make sure you move from description to analysis and evaluation . Give your interpretation of the facts, and explain the significance, consequences and implications of the statements you have made.

See our advice on structuring a paragraph for more information on how to attach analysis and evaluation to each point you make.

Descriptive vs critical writing examples

The following examples demonstrate the difference between descriptive writing and critical/analytical writing. They are taken from: Cottrell, S. 2003. The Study Skills Handbook. 2nd ed. London: Palgrave.

State what happened vs identify its significance

To write critically you will need to not only describe what happened, but also identify the significance of what happened.

Descriptive example

"The data shows that the incidence (new cases) of asthma rates in children under 15 years old increased rapidly from 1977, peaking in 1993 and then declining, though rates still remain significantly higher than pre-1976 levels."

Critical example

"The trend, from 1977 until 1993, of a rapid rise in rates of asthma diagnosis in children under 15 years, suggests that one of the causal factors was particularly prevalent during this time, but has since declined in importance or effect."

Explain the theory vs show its relevance

Descriptive writing will explain what the theory says. To write critically you need to go further and show why that theory is relevant.

"Carl Rogers' theory of a person-centred approach focuses on the freedom of the individual to determine what values should be used to measure successful personal outcomes or benefit, and is particularly relevant for social workers when wanting to take into account the diverse needs of the client group."

"Carl Rogers' theory of a person-centred approach is particularly suitable for social workers wanting to work with a client group with diverse needs because it allows the client to determine what values should be used to measure successful outcomes, rather than those externally determined by, for example, the service, state or dominant culture in society."

Note the method used vs indicate its appropriateness

Rather than simply noting the method used, which is the descriptive approach, a critical writer will show how appropriate that method was.

"In addition to competency-based questions, the candidates were asked to complete an in-tray exercise, which required them to allocate different priority levels to tasks, as an appropriate method to measure their likely performance in the actual job."

"In addition to competency-based questions, candidates were asked to complete an in-tray task prioritisation exercise. This was because it was considered a more effective way to measure likely performance in the actual role as the majority of the job would involve similar tasks, with little interaction with customers and therefore less requirement for highly developed communication skills."

You can apply our critical thinking model to your own work; use our Judging your own work (PDF) questions to help you decide if your writing is critical. These questions will take you through the description-analysis-evaluation stages. Take a look at further examples of descriptive writing vs. critical/analytical writing (PDF).

CBE Life Sci Educ. v.6(2); Summer 2007

Learning to Improve: Using Writing to Increase Critical Thinking Performance in General Education Biology

Ian J. Quitadamo

*Department of Biological Sciences, Central Washington University, Ellensburg, WA 98926-7537; and

Martha J. Kurtz

†Department of Chemistry, Central Washington University, Ellensburg, WA 98926-7539

Increasingly, national stakeholders express concern that U.S. college graduates cannot adequately solve problems and think critically. As a set of cognitive abilities, critical thinking skills provide students with tangible academic, personal, and professional benefits that may ultimately address these concerns. As an instructional method, writing has long been perceived as a way to improve critical thinking. In the current study, the researchers compared critical thinking performance of students who experienced a laboratory writing treatment with those who experienced traditional quiz-based laboratory in a general education biology course. The effects of writing were determined within the context of multiple covariables. Results indicated that the writing group significantly improved critical thinking skills whereas the nonwriting group did not. Specifically, analysis and inference skills increased significantly in the writing group but not the nonwriting group. Writing students also showed greater gains in evaluation skills; however, these were not significant. In addition to writing, prior critical thinking skill and instructor significantly affected critical thinking performance, whereas other covariables such as gender, ethnicity, and age were not significant. With improved critical thinking skill, general education biology students will be better prepared to solve problems as engaged and productive citizens.

INTRODUCTION

A National Call to Improve Critical Thinking in Science

In the past several years, an increasing number of national reports indicate a growing concern over the effectiveness of higher education teaching practices and the decreased science (and math) performance of U.S. students relative to other industrialized countries (Project Kaleidoscope, 2006). A variety of national stakeholders, including business and educational leaders, politicians, parents, and public agencies, have called for long-term transformation of the K–20 educational system to produce graduates who are well trained in science, can engage intelligently in global issues that require local action, and in general are better able to solve problems and think critically. Specifically, business leaders are calling for graduates who possess advanced analysis and communication skills, for instructional methods that improve lifelong learning, and ultimately for an educational system that builds a nation of innovative and effective thinkers (Business-Higher Education Forum and American Council on Education, 2003). Education leaders are similarly calling for institutions of higher education to produce graduates who think critically, communicate effectively, and who employ lifelong learning skills to address important scientific and civic issues (Association of American Colleges and Universities [AACU], 2005).

Many college faculty consider critical thinking to be one of the most important indicators of student learning quality. In its 2005 national report, the AACU indicated that 93% of higher education faculty perceived analytical and critical thinking to be an essential learning outcome (AACU, 2005) whereas 87% of undergraduate students indicated that college experiences contributed to their ability to think analytically and creatively. This same AACU report showed that only 6% of undergraduate seniors demonstrated critical thinking proficiency based on Educational Testing Services standardized assessments from 2003 to 2004. During the same time frame, data from the ACT Collegiate Assessment of Academic Proficiency test showed a similar trend, with undergraduates improving their critical thinking less than 1 SD from freshman to senior year. Thus, it appears a discrepancy exists between faculty expectations of critical thinking and students' ability to perceive and demonstrate critical thinking proficiency using standardized assessments (AACU, 2005).

Teaching that supports the development of critical thinking skills has become a cornerstone of nearly every major educational objective since the Department of Education released its six goals for the nation's schools in 1990. In particular, goal three of the National Goals for Education stated that more students should be able to reason, solve problems, and apply knowledge. Goal six specifically stated that college graduates must be able to think critically ( Office of Educational Research and Improvement, 1991 ). Since 1990, American education has tried—with some success—to make a fundamental shift from traditional teacher-focused instruction to more student-centered constructivist learning that encourages discovery, reflection, and in general is thought to improve student critical thinking skill. National science organizations have supported this trend with recommendations to improve the advanced thinking skills that support scientific literacy ( American Association for Higher Education, 1989 ; National Research Council, 1995 ; National Science Foundation, 1996 ).

More recent reports describe the need for improved biological literacy as well as international competitiveness ( Bybee and Fuchs, 2006 ; Klymkowsky, 2006 ). Despite the collective call for enhanced problem solving and critical thinking, educators, researchers, and policymakers are discovering a lack of evidence in existing literature for methods that measurably improve critical thinking skills ( Tsui, 1998 , 2002 ). As more reports call for improved K–20 student performance, it is essential that research-supported teaching and learning practices be used to better help students develop the cognitive skills that underlie effective science learning ( Malcom et al., 2005 ; Bybee and Fuchs, 2006 ).

Critical Thinking

Although they are not always transparent to many college students, the academic and personal benefits of critical thinking are well established; students who can think critically tend to get better grades, are often better able to use reasoning in daily decisions ( U.S. Department of Education, 1990 ), and are generally more employable ( Carnevale and American Society for Training and Development, 1990 ; Holmes and Clizbe, 1997 ; National Academy of Sciences, 2005 ). By focusing on instructional efforts that develop critical thinking skills, it may be possible to increase student performance while satisfying national stakeholder calls for educational improvement and increased ability to solve problems as engaged and productive citizens.

Although academics and business professionals consider critical thinking skill to be a crucial outcome of higher education, many would have difficulty defining exactly what critical thinking is. Historically, there has been little agreement on how to conceptualize critical thinking. Of the literally dozens of definitions that exist, one of the most organized efforts to define (and measure) critical thinking emerged from research done by Peter Facione and others in the early 1990s. Their consensus work, referred to as the Delphi report, was accomplished by a group of 46 leading theorists, teachers, and critical thinking assessment specialists from a variety of academic and business disciplines ( Facione and American Philosophical Association, 1990 ). Initial results from the Delphi report were later confirmed in a national survey and replication study ( Jones et al., 1995 ). In short, the Delphi panel expert consensus describes critical thinking as a “process of purposeful self-regulatory judgment that drives problem-solving and decision-making” ( Facione and American Philosophical Association, 1990 ). This definition implies that critical thinking is an intentional, self-regulated process that provides a mechanism for solving problems and making decisions based on reasoning and logic, which is particularly useful when dealing with issues of national and global significance.

The Delphi conceptualization of critical thinking encompasses several cognitive skills that include: 1) analysis (the ability to break a concept or idea into component pieces in order to understand its structure and inherent relationships), 2) inference (the skills used to arrive at a conclusion by reconciling what is known with what is unknown), and 3) evaluation (the ability to weigh and consider evidence and make reasoned judgments within a given context). Other critical thinking skills that are similarly relevant to science include interpretation, explanation, and self-regulation ( Facione and American Philosophical Association, 1990 ). The concept of critical thinking includes behavioral tendencies or dispositions as well as cognitive skills ( Ennis, 1985 ); these include the tendency to seek truth, to be open-minded, to be analytical, to be orderly and systematic, and to be inquisitive ( Facione and American Philosophical Association, 1990 ). These behavioral tendencies also align closely with behaviors considered to be important in science. Thus, an increased focus on teaching critical thinking may directly benefit students who are engaged in science.

Prior research on critical thinking indicates that students' behavioral dispositions do not change in the short term ( Giancarlo and Facione, 2001 ), but cognitive skills can be developed over a relatively short period of time (Quitadamo, Brahler, and Crouch, unpublished results). In their longitudinal study of behavioral disposition toward critical thinking, Giancarlo and Facione (2001) discovered that undergraduate critical thinking disposition changed significantly after two years. Specifically, significant changes in student tendency to seek truth and confidence in thinking critically occurred during the junior and senior years. Also, females tended to be more open-minded and have more mature judgment than males ( Giancarlo and Facione, 2001 ). Although additional studies are necessary to confirm results from the Giancarlo study, existing research seems to indicate that changes in undergraduate critical thinking disposition are measured in years, not weeks.

In contrast to behavioral disposition, prior research indicates that critical thinking skills can be measurably changed in weeks. In their study of undergraduate critical thinking skill in university science and math courses, Quitadamo, Brahler, and Crouch (unpublished results) showed that critical thinking skills changed within 15 wk in response to Peer Led Team Learning (a national best practice for small group learning). This preliminary study provided some evidence that undergraduate critical thinking skills could be measurably improved within an academic semester, but provided no information about whether critical thinking skills could be changed during a shorter academic quarter. It was also unclear whether the development of critical thinking skills was a function of chronological time or whether it was related to instructional time.

Numerous studies provide anecdotal evidence for pedagogies that improve critical thinking, but much of the existing research relies on student self-report, which limits the scope of interpretation. From the literature it is clear that, although critical thinking skills are some of the most valued outcomes of a quality education, additional research investigating the effects of instructional factors on critical thinking performance is necessary ( Tsui, 1998 , 2002 ).

Writing and Critical Thinking

Writing has been widely used as a tool for communicating ideas, but less is known about how writing can improve the thinking process itself ( Rivard, 1994 ; Klein, 2004 ). Writing is thought to be a vehicle for improving student learning ( Champagne and Kouba, 1999 ; Kelly and Chen, 1999 ; Keys, 1999 ; Hand and Prain, 2002 ), but too often is used as a means to regurgitate content knowledge and derive prescribed outcomes ( Keys, 1999 ; Keys et al., 1999 ). Historically, writing has been thought to contribute to the development of critical thinking skills ( Kurfiss and Association for the Study of Higher Education, 1988 ). Applebee (1984) suggested that writing improves thinking because it requires an individual to make his or her ideas explicit and to evaluate and choose among tools necessary for effective discourse. Resnick (1987) stressed that writing should provide an opportunity to think through arguments and that, if used in such a way, could serve as a “cultivator and an enabler of higher order thinking.” Marzano (1991) suggested that writing used as a means to restructure knowledge improves higher-order thinking. In this context, writing may provide an opportunity for students to think through arguments and use higher-order thinking skills to respond to complex problems ( Marzano, 1991 ).

Writing has also been used as a strategy to improve conceptual learning. Initial work focused on how the recursive and reflective nature of the writing process contributes to student learning ( Applebee, 1984 ; Langer and Applebee, 1985 , 1987 ; Ackerman, 1993 ). However, conclusions from early writing to learn studies were limited by confounding research designs and mismatches between writing activities and measures of student learning ( Ackerman, 1993 ). Subsequent work has focused on how writing within disciplines helps students to learn content and how to think. Specifically, writing within disciplines is thought to require deeper analytical thinking ( Langer and Applebee, 1987 ), which is closely aligned with critical thinking.

The influence of writing on critical thinking is less defined in science. Researchers have repeatedly called for more empirical investigations of writing in science; however, few provide such evidence ( Rivard, 1994 ; Tsui, 1998 ; Daempfle, 2002 ; Klein, 2004 ). In his extensive review of writing research, Rivard (1994) indicated that gaps in writing research limit its inferential scope, particularly within the sciences. Specifically, Rivard and others indicate that, despite the volume of writing students are asked to produce during their education, they are not learning to use writing to improve their awareness of thinking processes ( Resnick, 1987 ; Howard, 1990 ). Existing studies are limited because writing has been used either in isolation or outside authentic classroom contexts. Factors like gender, ethnicity, and academic ability that are not directly associated with writing but may nonetheless influence its effectiveness have also not been sufficiently accounted for in previous work ( Rivard, 1994 ).

A more recent review by Daempfle (2002) similarly indicates the need for additional research to clarify relationships between writing and critical thinking in science. In his review, Daempfle identified nine empirical studies that generally support the hypothesis that students who experience writing (and other nontraditional teaching methods) have higher reasoning skills than students who experience traditional science instruction. Of the relatively few noninstructional variables identified in those studies, gender and major did not affect critical thinking performance; however, the amount of time spent on and the explicitness of instruction to teach reasoning skills did affect overall critical thinking performance. Furthermore, the use of writing and other nontraditional teaching methods did not appear to negatively affect content knowledge acquisition ( Daempfle, 2002 ). Daempfle justified his conclusions by systematically describing the methodological inconsistencies for each study. Specifically, incomplete sample descriptions, the use of instruments with insufficient validity and reliability, the absence of suitable comparison groups, and the lack of statistical covariate analyses limit the scope and generalizability of existing studies of writing and critical thinking ( Daempfle, 2002 ).

Writing in the Biological Sciences

Biology's conceptual nature and its reliance on the scientific method as a means of understanding make it a natural place to teach critical thinking through writing. Some work has been done in this area, with literature describing various approaches to writing in the biological sciences that range from linked biology and English courses to writing across the biology curriculum to the directed use of writing to improve reasoning in biology courses ( Ebert-May et al., 1997 ; Holyoak, 1998 ; Taylor and Sobota, 1998 ; Steglich, 2000 ; Lawson, 2001 ; Kokkala and Gessell, 2003 ; Tessier, 2006 ). In their work on integrated biology and English, Taylor and Sobota (1998) discussed several problem areas that affected both biology and English students, including anxiety and frustration associated with writing, difficulty expressing thoughts clearly and succinctly, and a tendency to have strong negative responses to writing critique. Although the authors delineated the usefulness of several composition strategies for writing in biology ( Taylor and Sobota, 1998 ), it was unclear whether student data were used to support their recommendations. Kokkala and Gessell (2003) used English students to evaluate articles written by biology students. Biology students first reflected on initial editorial comments made by English students, and then resubmitted their work for an improved grade. In turn, English students had to justify their editorial comments with written work of their own. Qualitative results generated from a list of reflective questions at the end of the writing experience seemed to indicate that both groups of students improved editorial skills and writing logic. However, no formal measures of student editorial skill were collected before the biology-English student collaboration, so no definitive conclusions on the usefulness of this strategy could be made.

Taking a slightly different tack, Steglich (2000) informally assessed student attitudes in nonmajors biology courses, and noted that writing produced positive changes in student attitudes toward biology. However, the author acknowledged that this work was not a research study. Finally, Tessier (2006) showed that students enrolled in a nonmajors ecology course significantly improved writing technical skills and committed fewer errors of fact regarding environmental issues in response to a writing treatment. Attitudes toward environmental issues also improved ( Tessier, 2006 ). Although this study surveyed students at the beginning and the end of the academic term and also tracked student progress during the quarter, instrument validity and reliability were not provided. The generalizability of results was further limited because of an overreliance on student self-reports and small sample size.

Each of the studies described above peripherally supports a relationship between writing and critical thinking. Although not explicitly an investigation of critical thinking, results from a relatively recent study support a stronger connection between writing and reasoning ability ( Daempfle, 2002 ). Ebert-May et al. (1997) used a modified learning cycle instructional method and small group collaboration to increase reasoning ability in general education biology students. A quasi-experimental pretest/posttest control group design was used on a comparatively large sample of students, and considerable thought was given to controlling extraneous variables across the treatment and comparison groups. A multifaceted assessment strategy based on writing, standardized tests, and student interviews was used to quantitatively and qualitatively evaluate student content knowledge and thinking skill. Results indicated that students in the treatment group significantly outperformed control group students on reasoning and process skills as indicated by the National Association of Biology Teachers (NABT) content exam. Notably, student content knowledge did not differ significantly between the treatment and control sections, indicating that development of thinking skill did not occur at the expense of content knowledge ( Ebert-May et al., 1997 ). Interview data indicated that students experiencing the writing and collaboration-based instruction changed how they perceived the construction of biological knowledge and how they applied their reasoning skills. Although the Ebert-May study is one of the more complete investigations of writing and critical thinking to date, several questions remain. Supporting validity and reliability data for the NABT test were not included in the study, making interpretation of results somewhat less certain. In addition, the NABT exam is designed to assess high school biology performance, not college performance ( Daempfle, 2002 ). Perhaps more importantly, the NABT exam does not explicitly measure critical thinking skills.

Collectively, it appears that additional research is necessary to establish a more defined relationship between writing and critical thinking in science ( Rivard, 1994 ; Tsui, 1998 , 2002 ; Daempfle, 2002 ). The current study addresses some of the gaps in previous work by evaluating the effects of writing on critical thinking performance using relatively large numbers of students, suitable comparison groups, valid and reliable instruments, a sizable set of covariables, and statistical analyses of covariance. This study uses an experimental design similar to that of the Ebert-May et al. (1997) study but incorporates valid and reliable test measures of critical thinking that can be used both within and across different science disciplines.

Purpose of the Study

Currently there is much national discussion about increasing the numbers of students majoring in various science fields ( National Research Council, 2003 ; National Academy of Sciences, 2005 ). Although this is a necessary and worthwhile goal, attention should also be focused on improving student performance in general education science because these students will far outnumber science majors for the foreseeable future. If college instructors want general education students to think critically about science, they will need to use teaching methods that improve student critical thinking performance. In many traditional general education biology courses, students are not expected to work collaboratively, to think about concepts as much as memorize facts, or to develop and support a written thesis or argument. This presents a large problem when one considers the societal role that general education students will play as voters, community members, and global citizens. By improving their critical thinking skills in science, general education students will be better able to deal with the broad scientific, economic, social, and political issues they will face in the future.

The problem addressed by this study was to discover whether writing could improve student critical thinking performance in general education biology courses. How might writing in general education biology affect the analysis, inference, and evaluation skills that are inherent to critical thinking? What level of critical thinking skill do students bring to nonmajors biology courses? Can their critical thinking skills be measurably improved using writing? What other factors affect development of critical thinking skills? When do student critical thinking skills begin to change, and how much? In this study, the effect of writing on critical thinking performance was investigated using the California Critical Thinking Skills Test (CCTST) at the beginning (pretest) and end (posttest) of 10 sections of general education biology at a regional comprehensive university in the Pacific Northwest. Several research questions framed this investigation:

Does writing in laboratory affect critical thinking performance in general education biology? Does the development of analysis, inference, and evaluation skills differ between students who experience writing versus those who experience traditional laboratory instruction? What measurable effect do factors like gender, ethnicity, and prior thinking skill have on changes in critical thinking in general education biology? If critical thinking skills change during an academic quarter, when does that take place?

MATERIALS AND METHODS

Study context.

The study took place at a state-funded regional comprehensive university in the Pacific Northwest. All participants were nonmajor undergraduates who were taking biology to satisfy their general education science requirement. Ten total sections of general education biology offered over three academic quarters (one academic year) were included in the study. Four of the 10 sections implemented a writing component during weekly laboratory meetings (N = 158); six traditional quiz-based laboratory sections served as a nonwriting control group (N = 152). Only scores from students who had completed both the initial (pretest) and end-of-quarter (posttest) critical thinking assessments were included in the data analysis. A breakdown of participant demographics for the writing and nonwriting groups is provided in Table 1 .

Demographics for the writing and nonwriting groups

Sample             Class distribution (%)            Gender distribution (%)
                   Fr    So    Jr    Sr   2nd Sr     M     F
Writing (158)      44.9  33.5  15.2  3.8  2.5        38.6  61.4
No writing (152)   53.3  28.3   7.2  9.2  2.0        38.2  61.8
Overall (310)      49.0  31.0  11.3  6.5  2.3        38.4  61.6

Sample             Ethnic distribution (%)
                   Caucasian  Hispanic  African American  Native American  Asian  Other
Writing (158)      84.8       1.9       2.5               0                4.4    6.3
No writing (152)   81.6       4.6       1.3               1.3              5.9    5.3
Overall (310)      83.2       3.2       1.9               0.6              5.2    5.8

Demographics profile for the study sample. n values in parentheses.

a "Other" includes the "choose not to answer" response.

Each course section included a lecture component offered four times per week for 50 min and a laboratory component that met once a week for 2 h. Course lecture sections were limited to a maximum enrollment of 48 students, with two concurrent lab sections of 24 students. Two different instructors taught five writing sections and five other instructors taught 11 traditional sections over three consecutive quarters. Each course instructor materially participated in teaching laboratory with the help of one graduate assistant per lab section (two graduate students per course section). None of the instructors from treatment sections had implemented writing in the laboratory before the start of this study. Writing instructors were chosen on the basis of personal dissatisfaction with traditional laboratory teaching methods and willingness to try something new.

Strong efforts were made to establish equivalency between writing and nonwriting course sections a priori. Course elements that were highly similar included common lecture rooms, the use of similar (in most cases identical) textbooks, and a lab facility coordinated by a single faculty member. More specifically, three similarly appointed lecture rooms outfitted with contemporary instructional technology including dry erase boards, media cabinets, a networked computer, and digital projection were used to teach the nonmajors biology courses. The same nonmajors biology textbook was used across the writing and most of the nonwriting sections. All laboratory sections used a common lab facility and were taught on the same day of the week. Although the order in which specific labs were taught differed among sections, a common laboratory manual containing prescriptive exercises covering the main themes of biology (scientific method, cellular biology and genetics, natural selection and evolution, kingdoms of life, and a mammalian dissection) was used across all writing and nonwriting lab sections.

The primary differences between course sections were the laboratory writing component and how much time was devoted to laboratory activities. Those sections that experienced the writing treatment completed the prescriptive lab exercises in the first hour and engaged in writing during the second hour of the lab. Nonwriting sections allocated 2 h for the prescriptive lab exercises and included a traditional laboratory quiz rather than a writing assignment. The degree to which the writing and nonwriting sections included small group collaboration in laboratory varied, and all course sections differed with regard to individual instructor teaching style. Although all course sections used traditional lecture exams during the quarter to assess content knowledge, the degree to which rote memorization-based exam questions were used to evaluate student learning varied.

Description of the Writing Treatment

On the first day of lecture, students in the writing treatment group were told that their laboratory performance would be evaluated using collaborative essays instead of traditional quizzes. A brief overview of the writing assignments was included in associated course syllabi. During the first laboratory session of the quarter, students were grouped into teams of three or four individuals, and the criteria for completing weekly writing assignments were further explained.

The decision to use collaborative groups to support writing in the laboratory was partly based on existing literature ( Collier, 1980 ; Bruffee, 1984 ; Tobin et al., 1994 ; Jones and Carter, 1998 ; Springer et al., 1999 ) and prior research by Quitadamo, Brahler, and Crouch (unpublished results), who showed that Peer Led Team Learning (one form of collaborative learning) helped to measurably improve undergraduate critical thinking skills. Small group learning was also used in the nonwriting treatment groups to a greater or lesser extent depending on individual instructor preference.

Baseline critical thinking performance was established in the academic quarters preceding the writing experiment to more specifically attribute changes in critical thinking to the writing treatment. Concurrent nonwriting course sections were also used as comparison groups. The historical baseline provided a way to determine what student performance had been before experiencing the writing treatment, whereas the concurrent nonwriting groups allowed for a direct comparison of critical thinking performance during the writing treatment. Pretest scores indicating prior critical thinking skill were also used to further establish comparability between the writing and nonwriting groups.

Laboratory activities were coordinated for all sections by a single faculty member who taught in the nonwriting group. All faculty and graduate assistants met regularly to discuss course progress and laboratory procedure, and to coordinate resources. Nonwriting faculty drafted quizzes that addressed laboratory content knowledge. Writing faculty collaboratively crafted a consensus essay, or thought question, designed to elicit student critical thinking and ability to apply content knowledge. Each thought question was designed so that students had to apply lecture concepts and build on their conceptual understanding by integrating actual laboratory experiences (see Supplemental Appendix 1 , available online, for thought question examples). Weekly thought questions became progressively more difficult as the term progressed. Initial planning meetings took place just before the beginning of the academic quarter and included graduate assistant training to help them learn to consistently evaluate student writing using a modified thesis-based essay rubric (see Supplemental Appendix 2 ; Beers et al., 1994 ). A range of sample essays from poor to high quality was used to calibrate graduate assistant scoring and ensure consistency between assistants from different laboratory sections within the writing group. All graduate assistants and course instructors applied the thesis-based rubric to sample essays and worked toward consensus. Initial training ended when all graduate assistants scored within 0.5 points of each other on at least two sample essays.
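
The calibration stopping rule described above (all raters within 0.5 rubric points of one another on at least two sample essays) can be sketched as a simple check. This is an illustrative helper, not code from the study; the function names and example scores are hypothetical:

```python
def raters_agree(scores, tolerance=0.5):
    """True if every rater's rubric score for one essay falls within `tolerance`."""
    return max(scores) - min(scores) <= tolerance

def calibration_complete(essay_scores, tolerance=0.5, required=2):
    """True once raters agree, within tolerance, on at least `required` essays.

    `essay_scores` is a list of per-essay score lists, one score per rater.
    """
    return sum(raters_agree(s, tolerance) for s in essay_scores) >= required
```

With scores such as [[3.0, 3.5, 3.0], [4.0, 4.5, 4.5], [2.0, 3.5, 2.5]], the first two essays meet the tolerance, so training would be judged complete under this rule.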

Students were given weekly thought questions before beginning laboratory to help them frame their efforts during laboratory exercises. Students completed the prescriptive lab activities during the first hour, and then each student group relocated to an assigned computer lab in the same building and worked around a common computer terminal to draft a collective response to the weekly thought question. Students were allowed to use any suitable information or materials (laboratory observations, laboratory manuals, lecture notes, textbooks, the Internet, etc.) to help them address their thought question. Internal group discussions allowed students to argue individual viewpoints as they worked toward group agreement on each thought question. Essay responses to thought questions were answered using a standard five-paragraph format. Each essay included an introduction with a group-generated thesis statement, two to three body paragraphs that provided sufficient detail to support the thesis statement, and a summary paragraph that concluded the essay. Students were not allowed to work on essays outside of the laboratory environment.

Initial essay drafts were composed in Microsoft Word and submitted to the graduate assistant by the end of the laboratory period using the campus e-mail system. Graduate assistants evaluated each group's essay (typically six per lab section) and assigned an initial grade based on the thesis-based essay rubric. Graduate assistants made comments and suggestions electronically using Microsoft Word revising and track changes tools. Evaluated essays were e-mailed back to each student group, which addressed comments and suggestions during the subsequent week's laboratory writing time. Each student group submitted a final draft that was re-evaluated and assigned a final grade. During the second week, students both revised their essay from the previous week and then generated an initial draft for the current week's thought question, all within the lab writing hour. This was done to help students become more proficient writers within a short period of time. Overall, students in the writing group completed eight essays that, along with lab book scores, constituted 25% of their overall course grade. An identical percentage was used to calculate traditional quiz and lab book scores in all nonwriting course sections.

At the end of the quarter, each writing group member completed a peer evaluation for all group members, including themselves (see Supplemental Appendix 3 ). This was done to help students reflect on and evaluate their own performance, maximize individual accountability within the group, and make sure students received credit proportional to their contributions. The average peer evaluation score for each student was included as 5% of the final course grade.
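
As a concrete illustration of the grade weighting, the essay and lab book component (25%) and the peer evaluation (5%) combine with the remaining coursework as sketched below. The 70% remainder for lecture exams and other work is an assumption for the example, not a figure stated in the text:

```python
def final_grade(essays_labbook, peer_eval, other_coursework):
    """Weighted course grade; all inputs are percentage scores from 0 to 100.

    Weights: 25% essays + lab book and 5% peer evaluation (from the text);
    the 70% remainder (lecture exams, etc.) is assumed for illustration.
    """
    return 0.25 * essays_labbook + 0.05 * peer_eval + 0.70 * other_coursework
```

For example, a student scoring 80 on essays and lab book, 90 on peer evaluation, and 75 on the remaining coursework would earn a course grade of 77.0.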

Collectively, this approach to writing and evaluation was used to 1) help students reflect on and discuss deficiencies in their collective and written work, 2) provide an opportunity for students to explicitly address deficiencies in thesis development and general writing skill, 3) provide a suitable reward for student efforts to revise their work relative to established performance benchmarks, 4) improve individual accountability within each group, and 5) help students develop more efficient and effective writing skills that collectively might lead to improved critical thinking skill.

Assessment of Critical Thinking

Using critical thinking to indicate student learning performance is particularly useful because it can be measured within and across disciplines. Various instruments are available to assess critical thinking ( Watson and Glaser, 1980 ; Ennis and Weir, 1985 ; Facione, 1990b ; Center for Critical Thinking and Moral Critique, 1996 ); however, only the CCTST measures cognitive and meta-cognitive skills associated with critical thinking, is based on a consensus definition of critical thinking, and has been evaluated for validity and reliability for measuring critical thinking at the college level ( Facione, 1990a ; Facione et al., 1992 , 2004 ). The CCTST measures cognitive skills of analysis, inference, evaluation, induction, and deduction, with results expressed as raw scores or national percentile equivalents based on a national norming sample of students from 4-yr colleges and universities. Construct validity for the CCTST is high as indicated by greater than 95% consensus of the Delphi panel experts on the component skills of critical thinking. Test reliability (calculated using the KR–20 internal consistency method) is 0.78–0.84 for the form used in this study, a value considered to be within the recommended range for tests that measure a wide range of critical thinking skills ( Facione, 1991 ). The CCTST norming sample for 4-yr colleges and universities is based on a stratified sample of 2000 students from various disciplines, with science and math students making up approximately 30% of the norming sample. Approximately 20,000 college students complete the CCTST each year ( Insight Assessment and Blohm, 2005 ).
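
The KR-20 statistic cited above has a simple closed form for dichotomously scored items: r = (k/(k-1)) * (1 - sum(p_j * q_j) / var_total). A minimal sketch on made-up response data (not CCTST data) may help make the reliability figure concrete:

```python
def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomously scored test items.

    `responses` is a list of examinee score vectors (1 = correct, 0 = incorrect).
    """
    n = len(responses)      # number of examinees
    k = len(responses[0])   # number of items
    # Item difficulty p_j and its complement q_j = 1 - p_j
    p = [sum(r[j] for r in responses) / n for j in range(k)]
    pq = sum(pj * (1 - pj) for pj in p)
    # Population variance of the total scores
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq / var)
```

Values approaching 1 indicate high internal consistency; the 0.78–0.84 range reported for the CCTST form is considered acceptable for a test sampling a broad range of skills.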

The CCTST contains 34 questions and is a 45-min timed assessment of critical thinking. An online version of the CCTST was administered in this study, which allowed the researchers to collect student demographic data, including gender, ethnicity, age, and several other characteristics, at the same time critical thinking skill was measured. Total critical thinking skill as well as the analysis, inference, and evaluation component critical thinking skills (Facione, 1990c) were determined for each CCTST administration and compared across the writing and nonwriting groups.

Research Design

A quasi-experimental pretest/posttest control group design was used for this study to determine whether critical thinking performance in the writing group differed significantly from the nonwriting group. This design was chosen in order to compare critical thinking performance between intact groups, and because it was not feasible to randomly assign students from one course section to another within the sample. Frequency distributions of pretest/posttest changes in total critical thinking skill and analysis, inference, and evaluation component critical thinking skills were constructed to provide some indication of sample randomness and to inform assumptions for subsequent statistical analyses of covariance (see Figure 1 , A–D).

[Figure 1: cbe0020700740001.jpg]

(A–D) Frequency distribution of change in critical thinking skills. Distribution of change in critical thinking skill for the experimental sample. Changes are indicated using raw scores from CCTST pre- and posttests for total critical thinking skill (A) as well as analysis (B), inference (C), and evaluation (D) component critical thinking skills.

The pretest/posttest control group design was also used in order to minimize internal validity threats that could potentially compete with the effects of the writing treatment on student critical thinking performance. This design is widely used in educational research and generally controls for most threats to internal validity (Campbell and Stanley, 1963). Internal threats that remained a concern include history, maturation, pretest sensitization, selection, and statistical regression toward the mean. In the current study, history and maturation threats were minimized to the extent that the CCTST pretest and posttest were administered only 9 wk apart, and class standing and age covariables that indicate maturation were included in the statistical analysis. Pretest sensitization and selection were larger concerns for this design. Pretest sensitization was minimized in several ways: 1) prior critical thinking skill indicated by the CCTST pretest was used as a covariable in statistical analyses, 2) pretest/posttest to posttest-only comparison studies conducted by Insight Assessment indicate CCTST pretest sensitization is minimal (Facione, 1990a), and 3) neither the students, instructors, nor the test administrators had access to the correct answers on the CCTST, so repeat performance on the posttest was less likely. Selection threats were also reduced by using CCTST pretest scores in the statistical analyses, thereby making it more difficult to detect statistically significant differences in critical thinking performance between the writing and nonwriting groups. Statistical regression toward the mean, which was observed to some extent in this study, was minimized because a valid and reliable instrument was used to assess critical thinking (Facione, 1990a). Regression threats were also minimized to the extent that students with higher initial scores regressed much less than students with lower initial scores.

The generalizability of study results is limited because all data were collected at a single university. Specific threats to external validity include selection-treatment interaction and treatment diffusion. These threats were minimized because writing was mandatory for all treatment group participants, thereby minimizing volunteer effects. Because the writing also took considerable student effort, it is less likely that treatment diffusion occurred. In summary, the pretest/posttest control group design was used to minimize internal and external validity threats and maximize the ability to determine the effects of writing on student critical thinking performance.

Study Variables and Data Analysis

Effect of Writing on Critical Thinking Performance.

General education biology students were divided into writing and nonwriting groups (independent variable). Changes in CCTST pretest/posttest scores (dependent variable) were determined to discover whether writing influenced student critical thinking performance. Two CCTST outcome measures were used to statistically test for writing effect: 1) raw scores for total critical thinking skill, and 2) raw scores for analysis, inference, and evaluation component skills. Results were reported using raw scores and corresponding national percentile rank so that critical thinking performance outcomes would be more meaningful and intuitive. Conversion of CCTST raw scores to national percentile ranking was done using SPSS (SPSS, Inc., Chicago, IL) statistical software and a linear estimation conversion script based on an equivalency scale from Insight Assessment (Millbrae, CA).
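
The conversion from CCTST raw scores to national percentile rank described above amounts to linear interpolation against an equivalency scale. A minimal sketch follows; the scale points below are hypothetical placeholders, not Insight Assessment's actual equivalency table.

```python
import numpy as np

# Hypothetical equivalency scale: (CCTST raw score, national percentile)
raw_points = np.array([0, 10, 17, 25, 34])
pct_points = np.array([1, 20, 50, 85, 99])

def raw_to_percentile(raw_scores):
    """Linearly interpolate raw CCTST scores onto national percentile ranks."""
    return np.interp(raw_scores, raw_points, pct_points)

print(raw_to_percentile([17]))            # [50.]
print(raw_to_percentile([15.65, 16.00]))  # e.g., pre/post group means
```

`np.interp` clamps scores outside the scale to the endpoint percentiles, which mirrors how an equivalency table bounds extreme raw scores.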

Several covariables were included in the analysis to increase statistical accuracy and precision, and to more specifically isolate the effects of writing on critical thinking performance. CCTST pretest scores were used to indicate initial critical thinking skill. Gender and ethnicity helped to account for male/female or race-specific changes in critical thinking performance and were also used to identify potential sources of performance bias. Academic term and time of day were used to account for critical thinking differences due to the time of year each course was offered and the time of day each student took the course, respectively. Class standing and age were used to indicate maturation related to time in college and chronological age, respectively. Finally, the instructor covariable was used to account for performance differences due to individual teaching styles.

Statistical Analysis of Effect of Writing.

Several statistical analyses were conducted to determine the effects of writing on critical thinking performance in general education biology. First, an analysis of covariance (ANCOVA) test provided insight regarding differences in overall critical thinking performance between the writing and nonwriting groups. Changes in CCTST total raw scores and national percentile ranking were used as composite measures of critical thinking (Facione, 1990c) in this initial analysis. Second, changes in particular component critical thinking skills (analysis, inference, and evaluation) were evaluated using a multivariate analysis of covariance (MANCOVA) test because there were three dependent variables. The ANCOVA and MANCOVA tests also provided some insight into the effect the covariables had on critical thinking performance in general education biology. Collectively, these statistical tests allowed for a more accurate and precise analysis because variance associated with the covariables could be more specifically isolated from the writing treatment. Mean, SE, and effect size were also compared between the writing and nonwriting groups. Effect size, represented in standard units, was used to compare the magnitude of writing effect in the study.
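
Conceptually, the ANCOVA for writing effect compares a model containing the covariable(s) alone against one that adds the writing/nonwriting indicator, with the F statistic computed from the drop in residual sum of squares. The numpy-only sketch below illustrates this nested-model logic on synthetic data; the group sizes, effect magnitudes, and seed are all hypothetical, not the study's values.

```python
import numpy as np

def ancova_f(y, group, covariate):
    """F test for a group effect after adjusting for one covariate,
    via nested ordinary-least-squares models."""
    n = len(y)
    X_full = np.column_stack([np.ones(n), group, covariate])
    X_red = np.column_stack([np.ones(n), covariate])

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return resid @ resid

    rss_full, rss_red = rss(X_full), rss(X_red)
    df_num = X_full.shape[1] - X_red.shape[1]   # one extra parameter (group)
    df_den = n - X_full.shape[1]
    return ((rss_red - rss_full) / df_num) / (rss_full / df_den)

# Synthetic data: pretest covariate plus a genuine group effect
rng = np.random.default_rng(0)
pretest = rng.normal(15, 3, size=300)
group = np.repeat([0, 1], 150)                  # 0 = nonwriting, 1 = writing
change = 0.3 * (pretest - 15) + 1.5 * group + rng.normal(0, 1, 300)

print(ancova_f(change, group, pretest))         # large F => significant group effect
```

Statistical packages report the same quantity (with additional covariables partialed out the same way), so this sketch is the conceptual core of the Table 2 F values rather than a reimplementation of the authors' SPSS analysis.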

Analysis of Thought Question Performance.

Performance on weekly thought questions was analyzed to discover specifically when and how much student critical thinking skills changed during the academic term. This analysis also provided context for CCTST critical thinking performance measures. Specifically, average scores from a representative sample of writing course sections (approximately 100 students) were used to compare initial essay drafts across the weeks of the term to discover when students began to show changes in their first attempt at each essay. Weekly performance on final revised essays was also compared to determine how student final submissions changed over time. Finally, the weekly difference between each initial essay and each final essay was compared to determine how much the revision process changed during the term. These calculations collectively helped to provide a profile of critical thinking performance over time.

Participant Demographics

Student demographics provided in Table 1 indicated an overall distribution of approximately 49% freshmen, 31% sophomores, 11% juniors, and 9% seniors. Approximately 74% of the writing group students were freshmen and sophomores, whereas 82% of the nonwriting group was underclassmen. Overall, 61% of the sample was female and 39% male, with near identical gender distribution across the writing and nonwriting groups. The predominant ethnicity in the sample was Caucasian (>83%), with Asian American (5%), Latino/Hispanic (3%), African American (2%), and Native American (1%) students comprising the remainder of the sample. About 6% of the sample classified themselves as having some other ethnicity or chose not to identify their ethnic heritage.

Statistical Assumptions

Analysis of covariance and multivariate analysis of covariance tests were used to compare critical thinking performance between the writing and nonwriting groups. The evaluated assumptions for the ANCOVA and MANCOVA tests were homogeneity of slopes, homogeneity of covariances, and normality. An analysis evaluating the homogeneity of slopes assumption indicated that the relationship between the covariables and the critical thinking performance dependent variable did not differ significantly by the writing/nonwriting independent variable for the ANCOVA test, F(2, 307) = 1.642, p = 0.195, power = 0.346, partial η2 = 0.011, or the MANCOVA test, F(6, 610) = 1.685, p = 0.122, power = 0.645, partial η2 = 0.016. These results confirmed that both analyses of covariance met the homogeneity of slopes assumption. The homogeneity of covariance assumption was tested using Levene's and Box's tests. Levene's test results for the ANCOVA indicated that error variances were not equal across the writing and nonwriting groups, F(1, 308) = 7.139, p = 0.008. Similarly, Box's test results indicated that covariance was not equal for the writing and nonwriting groups, F(6, 684,530) = 4.628, p = 0.000. These results indicated that the ANCOVA/MANCOVA tests did not meet the homogeneity of covariance assumption. To more fully evaluate this assumption, distributions of total and component critical thinking skill were constructed (see Figure 1, A-D). Because analysis of covariance is reasonably robust to this violation when group sizes are similar, and because the writing and nonwriting groups were highly similar in size and no post hoc tests were conducted, the ANCOVA and MANCOVA tests were judged the best statistical measures to answer the research questions. Finally, the normality assumption was evaluated using the previously constructed frequency distributions for total change in critical thinking (Figure 1A) as well as change in analysis (Figure 1B), inference (Figure 1C), and evaluation (Figure 1D) critical thinking skills.
Frequency distributions of total and component critical thinking dependent variables indicated that each approximated a standard normal curve.
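
The homogeneity-of-variance check reported above (Levene's test) can be sketched as a one-way ANOVA on absolute deviations from each group's mean. The data below are synthetic and the values illustrative, not the study's; the W statistic is compared against an F distribution in practice.

```python
import numpy as np

def levene_w(*groups):
    """Levene's W (center = mean): one-way ANOVA on |x - group mean|."""
    z = [np.abs(g - g.mean()) for g in groups]
    k = len(z)
    n_total = sum(len(zi) for zi in z)
    z_grand = np.concatenate(z).mean()
    between = sum(len(zi) * (zi.mean() - z_grand) ** 2 for zi in z) / (k - 1)
    within = sum(((zi - zi.mean()) ** 2).sum() for zi in z) / (n_total - k)
    return between / within

rng = np.random.default_rng(1)
group_a = rng.normal(0, 1.0, 200)    # smaller spread
group_b = rng.normal(0, 3.0, 200)    # much larger spread
print(levene_w(group_a, group_b))    # large W flags unequal variances
```

A large W, as in the study's F(1, 308) = 7.139, signals that the error-variance assumption is violated and that results should be interpreted with the robustness caveats discussed above.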

Effect of Writing on Total Critical Thinking Performance

The ANCOVA test of total critical thinking performance showed that writing and nonwriting groups differed significantly, F(1, 300) = 19.357, p < 0.0001, power = 0.992, partial η 2 = 0.061 (see Table 2 ). The strength of the relationship between the writing/nonwriting groups and critical thinking performance was modest but significant, accounting for more than 6% of the variance in critical thinking performance.

ANCOVA results for total critical thinking performance

Treatment      | F      | df  | p(a)  | Power | Effect size
Writing        | 19.357 | 300 | 0.000 | 0.992 | 0.061
CCTST pretest  | 19.713 | 300 | 0.000 | 0.993 | 0.062
Instructor     | 7.745  | 300 | 0.006 | 0.792 | 0.025
Time of day    | 6.291  | 300 | 0.013 | 0.705 | 0.021
Gender         | 0.226  | 300 | 0.635 | 0.076 | 0.001
Ethnicity      | 2.326  | 300 | 0.128 | 0.330 | 0.008
Age            | 0.453  | 300 | 0.502 | 0.103 | 0.002
Class standing | 0.002  | 300 | 0.962 | 0.050 | 0.000
Academic term  | 2.387  | 300 | 0.123 | 0.338 | 0.008

Analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pre-test), academic term, time of day, and instructor.

a Significance tested at 0.05 level.

Descriptive statistics of total critical thinking performance in the writing and nonwriting groups were also calculated (see Table 3 ). The writing group showed an average CCTST raw score change of 1.18 compared with the nonwriting group, which showed an average raw score change of −0.51. These critical thinking raw scores equated to gains in national percentile rank of 7.47 (45th to 53rd percentile) for the writing group and −2.09 (42nd to 40th percentile) for the nonwriting group. Critical thinking improvement in the writing group was approximately nine times greater than the nonwriting group (see Figure 2 ).

Writing effect on total critical thinking performance: CCTST raw scores

Treatment        | Pre mean | Post mean | Pre SEM | Post SEM | Raw CT change
Writing (158)    | 15.84    | 17.02     | 0.32    | 0.36     | 1.18
Nonwriting (152) | 15.46    | 14.95     | 0.34    | 0.43     | −0.51
Overall (310)    | 15.65    | 16.00     | 0.33    | 0.40     | 0.34

Comparison of writing and nonwriting group performance based on CCTST raw scores. CCTST raw score range was 0–34; n values in parentheses.

[Figure 2: cbe0020700740002.jpg]

Effect of writing on total critical thinking national percentile rank. Comparison of total critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The ANCOVA test of total critical thinking skill indicated that gender, ethnicity, age, class standing, and academic term did not significantly affect critical thinking performance (see Table 2). Covariables that significantly affected total critical thinking performance included 1) CCTST pretest score, F(1, 300) = 19.713, p < 0.0001, power = 0.993, partial η2 = 0.062, 2) instructor, F(1, 300) = 7.745, p < 0.006, power = 0.792, partial η2 = 0.025, and 3) time of day, F(1, 300) = 6.291, p < 0.013, power = 0.705, partial η2 = 0.021. The effect of prior critical thinking skill (CCTST pretest) was moderately strong, accounting for more than 6% of the variance in total critical thinking performance. The effects of instructor and time of day were smaller, accounting for 2.5 and 2%, respectively, of total critical thinking performance variance. Critical thinking improvement associated with CCTST pretest score was approximately 2.5 times greater than for instructor and nearly three times greater than for time of day.

Effect of Writing on Component Critical Thinking Performance

The MANCOVA test indicated that analysis, inference, and evaluation critical thinking skills differed significantly between the writing and nonwriting groups, Wilks λ = 0.919, F(3, 296) = 8.746, p < 0.0001, power = 0.995, partial η 2 = 0.081 (see Table 4 ). The strength of the relationship between writing and component critical thinking performance was modest but significant, accounting for more than 8% of the variance in critical thinking performance.
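
The Wilks λ statistic underlying the MANCOVA can be illustrated (covariables omitted for brevity) as det(E)/det(E + H), where E and H are the within-group and between-group scatter matrices of the dependent variables. The sketch below uses synthetic three-skill change scores whose means loosely echo Table 5; the noise level and seed are hypothetical.

```python
import numpy as np

def wilks_lambda(groups):
    """Wilks lambda for a one-way multivariate group comparison.

    groups: list of (n_i, p) arrays of dependent-variable scores.
    """
    grand_mean = np.concatenate(groups).mean(axis=0)
    p = groups[0].shape[1]
    E = np.zeros((p, p))   # within-group (error) scatter
    H = np.zeros((p, p))   # between-group (hypothesis) scatter
    for g in groups:
        m = g.mean(axis=0)
        centered = g - m
        E += centered.T @ centered
        d = (m - grand_mean).reshape(-1, 1)
        H += len(g) * (d @ d.T)
    return np.linalg.det(E) / np.linalg.det(E + H)

# Synthetic change scores for (analysis, inference, evaluation)
rng = np.random.default_rng(2)
writing = rng.normal([0.33, 0.48, 0.37], 1.0, size=(158, 3))
nonwriting = rng.normal([-0.36, -0.24, 0.09], 1.0, size=(152, 3))
lam = wilks_lambda([writing, nonwriting])
print(lam)   # lambda well below 1 indicates multivariate group separation
```

Smaller λ means more of the multivariate variance is attributable to group membership; the study's λ = 0.919 corresponds to the reported 8% effect size (1 − λ ≈ partial η2 for this design).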

MANCOVA results for component critical thinking performance

Treatment          | Wilks λ | F      | df  | p     | Power | Effect size
Writing            | 0.919   | 9.746  | 296 | 0.000 | 0.995 | 0.081
Analysis pretest   | 0.623   | 59.737 | 296 | 0.000 | 1.000 | 0.377
Inference pretest  | 0.681   | 46.222 | 296 | 0.000 | 1.000 | 0.319
Evaluation pretest | 0.613   | 62.398 | 296 | 0.000 | 1.000 | 0.387
Gender             | 0.984   | 1.602  | 296 | 0.189 | 0.420 | 0.016
Ethnicity          | 0.983   | 1.756  | 296 | 0.156 | 0.456 | 0.017
Age                | 0.988   | 1.153  | 296 | 0.328 | 0.456 | 0.012
Class standing     | 0.978   | 2.186  | 296 | 0.090 | 0.553 | 0.022
Instructor         | 0.956   | 4.508  | 296 | 0.004 | 0.880 | 0.044
Quarter            | 0.991   | 0.899  | 296 | 0.442 | 0.246 | 0.009
Time of day        | 0.980   | 2.022  | 296 | 0.111 | 0.517 | 0.020

Multivariate analysis of covariance for the writing and nonwriting groups. Tested covariables included gender, ethnicity, class standing, age, prior critical thinking skill (CCTST pretest), academic term, time of day, and instructor.

Specifically, significant gains in analysis and inference skills were observed in the writing group but not the nonwriting group. No statistically significant gains in evaluation skill were observed in either group (see Table 5 ). National percentile rank equivalents for CCTST component raw scores indicated the writing group gained 10.51 percentile in analysis skill (42nd to 52nd percentile), 6.05 percentile in inference skill (45th to 52nd percentile), and 5.16 percentile in evaluation skill (46th to 52nd percentile). The nonwriting group showed a national percentile rank change of −4.43 percentile in analysis skill (47th to 42nd percentile), −2.23 percentile in inference skill (42nd to 40th percentile), and 1.37 percentile in evaluation (44th to 45th percentile; see Figure 3 ). Critical thinking performance for the writing group was 15 times greater for analysis and 8 times greater for inference skills than for the nonwriting group. Although neither the writing nor the nonwriting group showed significant gains in evaluation skill, the writing group showed more than 3 times greater improvement than did the nonwriting group.

Effect of writing on component critical thinking performance

Component skill     | Writing (n = 158) raw score | Writing SEM | Nonwriting (n = 152) raw score | Nonwriting SEM
Analysis (pre)      | 4.22  | 0.11 | 4.35  | 0.11
Analysis (post)     | 4.54  | 0.10 | 3.99  | 0.13
Analysis (change)   | 0.33  | 0.11 | −0.36 | 0.14
Inference (pre)     | 7.42  | 0.18 | 7.07  | 0.20
Inference (post)    | 7.91  | 0.20 | 6.83  | 0.23
Inference (change)  | 0.48  | 0.16 | −0.24 | 0.21
Evaluation (pre)    | 4.20  | 0.14 | 4.04  | 0.15
Evaluation (post)   | 4.57  | 0.15 | 4.13  | 0.17
Evaluation (change) | 0.37  | 0.15 | 0.09  | 0.17

Comparison of writing and nonwriting group performance based on critical thinking component skill raw scores (CCTST subscales). Score range was 0–7 (analysis), 0–16 (inference), and 0–11 (evaluation).

[Figure 3: cbe0020700740003.jpg]

Effect of writing on component critical thinking national percentile rank. Comparison of component critical thinking national percentile gains between writing and nonwriting groups. Percentile ranking was computed using CCTST raw scores, an equivalency scale from Insight Assessment, and a linear conversion script in SPSS.

The MANCOVA test of analysis, inference, and evaluation skills indicated that gender, ethnicity, age, class standing, academic term, and time of day did not significantly affect critical thinking performance. Critical thinking performance was affected by prior analysis, inference, and evaluation skill (CCTST component pretest scores) and instructor (see Table 4 ). Specifically, component pretest scores had a large effect on critical thinking, accounting for 38% (analysis), 32% (inference), and 39% (evaluation) of critical thinking performance variance. The effect of instructor was smaller, accounting for 4.4% of variation in critical thinking skill. The effect of prior component critical thinking skill was approximately 4.5 times greater than the effect of writing, and nearly 9 times greater than the effect of instructor.

Student Thought Question Performance

Critical thinking performance on student essays was evaluated by applying a thesis-based essay rubric (see Supplemental Appendix 2 ) on initial submissions and final revised essays. Average weekly performance during the academic term is shown in Figure 4 . A comparison of initial essays indicated that students improved 53.3% from week 1 (average score of 27.9%) to week 7 (average score of 81.2%). A similar comparison of final essays showed that students improved 32.5% from week 1 (average score of 54.1%) to week 7 (average score of 86.6%). The largest changes between initial and final essays occurred in week 1 (change of 26.2%), and decreased each week thereafter (24.8, 23.9, 18.8, 8, 7.8, and 5.4% for weeks 2 through 7, respectively). These results showed that students produced little evidence of critical thinking skill in their writing early in the term, but improved dramatically on both initial and revised essay submissions by the end of the term.
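
The improvement figures above are simple differences between the reported weekly rubric averages, which can be checked directly (values taken from the text; weekly averages other than weeks 1 and 7 are not reproduced here):

```python
# Average rubric scores (percent) reported in the text
initial_wk1, initial_wk7 = 27.9, 81.2   # first-draft essays
final_wk1, final_wk7 = 54.1, 86.6       # revised essays

initial_gain = round(initial_wk7 - initial_wk1, 1)    # 53.3
final_gain = round(final_wk7 - final_wk1, 1)          # 32.5
wk1_revision_gap = round(final_wk1 - initial_wk1, 1)  # 26.2

print(initial_gain, final_gain, wk1_revision_gap)
```

The shrinking draft-to-revision gap (26.2% in week 1 down to 5.4% by week 7) is what supports the claim that students' first attempts, not just their revisions, improved over the term.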

[Figure 4: cbe0020700740004.jpg]

Profile of change in critical thinking performance in writing group. Comparison of student writing performance on weekly initial and revised essays. Essay scores were derived using a thesis-based critical thinking rubric (see Supplemental Appendix 2 ). Average essay scores were computed across writing sections.

The purpose of this study was to discover whether writing could measurably influence critical thinking performance in general education biology. Results indicated that students from the writing group significantly outperformed their nonwriting peers in both total critical thinking skill and the component critical thinking skills of analysis and inference. The writing and nonwriting groups were highly similar initially and began the academic term with comparable critical thinking ability (45th and 42nd national percentile for writing and nonwriting, respectively). By the end of the term, writing students had improved their critical thinking skill to above the 52nd percentile whereas nonwriting students decreased to below the 40th percentile. In addition to writing, prior critical thinking skill and course instructor significantly affected critical thinking performance, with prior critical thinking skill having the largest effect on critical thinking gains of any variable tested. Further analysis of the writing group showed that the largest gains in critical thinking occurred during the first few weeks of the term, with more gradual improvement during the remainder of the term. A comparison of average critical thinking performance on initial essays and revised essays showed that thinking skills improvement was greater on initial essays (53%) than on final essays (33%). Collectively, the results of this study indicated that students who experienced writing in general education biology significantly improved their critical thinking skills.

The covariance analysis that was conducted provided a partial means to separate out the effects of writing, prior critical thinking skill, instructor, and multiple covariables from total and component critical thinking gains. The analysis of total critical thinking skill indicated that writing students changed their critical thinking skill from below the national average to above the national average within an academic quarter, whereas nonwriting students remained below the national average. This observation is important because it shows that students can develop critical thinking skills within a fairly short 9-wk period of time, and that writing can play a role in that process. A similar study showed critical thinking skills improve over 15 wk (Quitadamo, Brahler, and Crouch, unpublished results); however, that study provided no insight into whether critical thinking skills could be changed over a shorter period of time, in a different academic setting, or in response to instructional variables such as writing.

Although critical thinking gains were influenced by writing, they did not appear to be affected by gender, ethnicity, class standing, or age. In fact, statistical results indicated that these variables collectively had a very small effect on critical thinking performance. Gender distribution was nearly identical across the writing and nonwriting groups, and was predominantly female (nearly 62%). Ethnic distribution was also highly similar across the writing and nonwriting groups, but the sample was largely Caucasian (>84%). Class standing varied a little more across the writing and nonwriting groups, with the sample largely composed of underclassmen (70%). Although nearly three-quarters of the sample was between 18 and 21 years of age, nearly 10% was over 21, with a fair number of older nontraditional students represented. It is possible that a more diverse sample would have produced different results, or it may be that the individuals participating in this study responded particularly well to writing. Although further investigation of these variables is necessary and important, it was beyond the scope of the current study.

The analysis of component skills provided greater insight into the particular critical thinking skills that students changed in response to writing. Specifically, writing students significantly improved their analysis and inference skills whereas nonwriting students did not. Writing students also improved their evaluation skills much more than nonwriting students, although not significantly. These results indicate that the process of writing helps students develop improved analytical and inference skills. Prior research indicates that the writing to learn strategy is effective because students must conceptually organize and structure their thoughts as well as their awareness of thinking processes ( Langer and Applebee, 1987 ; Ackerman, 1993 ; Holliday, 1994 ; Rivard, 1994 ). More specifically, as students begin to shape their thoughts at the point of construction and continually analyze, review, and clarify meaning through the processes of drafting and revision, they necessarily engage and apply analysis and inference skills ( Klein, 1999 ; Hand and Prain, 2002 ). In this study, the process of writing appears to have influenced critical thinking gains. It also seems likely that writing students experienced a greater cognitive demand than nonwriting students simply because the writing act required them to hypothesize, debate, and persuade ( Rivard, 1994 ; Hand and Prain, 2002 ) rather than memorize as was the case in nonwriting control courses.

Conversely, the lack of any significant change in analysis, inference, or evaluation skills in the nonwriting group indicated that the traditional lab instruction used in the general education biology control courses did not help students develop critical thinking skills. Based on the results of this study, it could be argued that traditional lab instruction actually prevents the development of critical thinking skills, which presents a rather large problem when one considers how frequently these traditional methods are used in general education biology courses. One also has to consider that the critical thinking gains seen in the writing group might also have resulted from the relative absence of traditional lab instruction rather than writing alone. Additional research will be necessary to gain further insight into this question. Either way, changes to the traditional model of lab instruction will be necessary if the goal is to enhance the critical thinking abilities of general education biology students.

The variable that had the largest impact on critical thinking performance gains was prior critical thinking skill. This phenomenon was previously observed by Quitadamo, Brahler, and Crouch (unpublished results) in a related study that investigated the effect of Peer Led Team Learning on critical thinking performance. That study focused on science and math major undergraduate critical thinking performance at a major research university, and found that, in addition to Peer Led Team Learning, prior critical thinking skill significantly influenced critical thinking performance (Quitadamo, Brahler, and Crouch, unpublished results). Specifically, students with the highest prior critical thinking skill showed the largest performance gains, whereas students with low initial skill were at a comparative disadvantage. The fact that prior critical thinking skill also had a large effect on critical thinking performance in this study increases the generalizability of the observation and underscores its importance. Simply put, students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills, not because they lack the cognitive hard-wiring to perform but because they lack the tools to build their knowledge. Is it reasonable or just to expect otherwise comparable students to perform at similar levels when only some of them have the keys for success? If we hope to improve the perception of science in this country, we need to educate people on how to think about important scientific issues, and not simply argue a position based on one school of thought. By helping general education students to develop critical thinking skills, it is hoped that they will be better able to think rationally about science.

The observation that students who come to general education biology with greater critical thinking skills leave with the largest skill gains has important implications for the K–12 school system as well. If a high proportion of students are coming to institutions of higher education lacking critical thinking skills, why are these skills not being explicitly taught in the K–12 system? Ideally, students would learn the foundational tenets of critical thinking at an earlier age, and be able to refine and hone these skills as they progress through the K–20 education system. The results of this study reinforce the idea that students should be explicitly taught critical thinking skills and be expected to practice them as early and often as possible.

Although its effect was smaller than writing or prior critical thinking skill, the instructor variable also played a significant role in student critical thinking performance, accounting for 2.5% of the total variance in critical thinking gains. Determining the particular qualities of each instructor that contributed to student critical thinking success and further separating instructor and writing effects will require additional research. Previous research indicates that teaching style positively influences certain aspects of student learning ( Grasha, 1994 ; Hativa et al., 2001 ; Bain, 2004 ), but the qualities that specifically influence student critical thinking gains have not been sufficiently investigated. Additional research in this area is necessary.

Faculty considering whether to use writing in the laboratory may wonder about how much time and energy it takes to implement, whether efforts to change will translate into improved student learning, and how these changes affect disciplinary content. From a practical perspective, implementing writing did not take more time and effort per se; rather, it required faculty to reconceptualize how they spent their instructional time. Instead of individually developing course materials, writing faculty collaborated to a greater extent than nonwriting faculty on course design and assessments that required students to demonstrate their critical thinking skill. Interviews of faculty from the writing and nonwriting groups indicated that writing faculty felt the course was less work because they collaborated with colleagues and because students demonstrated improved thinking skill. Writing faculty generally became more comfortable with the new model after ∼2–3 wk when students began to show observable changes in writing proficiency and critical thinking. Together, collaboration with colleagues and observed gains in critical thinking tended to create a positive feedback loop that helped to sustain writing faculty efforts. In contrast, nonwriting faculty similarly wanted their students to think better but were convinced that traditional methods would be more effective, and so remained closed to change. There were some logistical challenges with writing, such as scheduling computer labs where students could draft and revise their weekly essay responses under instructor and teaching assistant supervision. Teaching assistants (and faculty) also needed to be trained on how to evaluate writing using a rubric. Finally, with regard to content coverage, no lecture or laboratory content was cut in order to implement writing because writing and nonwriting students both performed the same lab activities.
Collectively, the benefits of using writing in laboratory should encourage faculty who want their students to learn to think critically to give it a try.

Future Directions

This study showed that writing affects student critical thinking skill in a nonmajors biology course, but the results have generated more questions than answers. How does writing specifically produce gains in critical thinking performance? What factors influence student prior critical thinking skill? How do instructors specifically influence student gains in critical thinking? Future studies that analyze student essays in more detail would provide greater insight into how writing influences critical thinking skill. Using writing in other nonmajors science courses, such as chemistry, geology, or physics, would help determine the transferability of this method. Additional studies that investigate student prior critical thinking skill and instructor variables are also necessary. These future studies would further contribute to the knowledge base in this area and address some of its identified limitations (Ebert-May et al., 1997; Daempfle, 2002). Results from these studies would also increase the generalizability of the results reported here.

CONCLUSIONS

Building on existing research and on the basis of several lines of evidence presented in this study, we conclude that writing positively influences critical thinking performance for general education biology students. Students with prior critical thinking skill may have a comparative advantage over other general education biology students who have not developed the same skills. To rectify that inequity, critical thinking skills should be explicitly taught early and used often throughout the K–20 academic process. Because particular instructors appear to improve student critical thinking skills more than others, students should be discerning in their choice of instructors if they want to improve their critical thinking skills. Whether writing will prove useful for improving critical thinking in other general education science courses will likely depend on a host of factors, but it has potential. Further study of writing in general education science will be necessary to verify these results and to discover the breadth and depth of how writing affects critical thinking skill.

ACKNOWLEDGMENTS

We thank Drs. Holly Pinkart, Roberta Soltz, Phil Mattocks, and James Johnson and undergraduate researchers Matthew Brewer, Dayrk Flaugh, Adam Wallace, Colette Watson, Kelly Vincent, and Christine Weller for their valuable contributions to this study. The authors also acknowledge the generous financial support provided by the Central Washington University Office of the Provost and the Office of the Associate Vice President for Undergraduate Studies.

  • Ackerman J. M. The promise of writing to learn. Writ. Commun. 1993; 10 (3):334–370. [ Google Scholar ]
  • American Association for the Advancement of Science. Washington, DC: 1989. Science for All Americans. A Project 2061 Report on Literacy Goals in Science, Mathematics, and Technology. [ Google Scholar ]
  • Applebee A. N. Writing and reasoning. Rev. Educ. Res. 1984; 54 (4):577–596. [ Google Scholar ]
  • Association of American Colleges and Universities. Washington, DC: 2005. Liberal Education Outcomes: A Preliminary Report on Student Achievement in College. [ Google Scholar ]
  • Bain K. Cambridge, MA: Harvard University Press; 2004. What the Best College Teachers Do. [ Google Scholar ]
  • Beers T., McIssac C., Henderson B., Gainen J. Writing: thesis and support scoring guide. 1994. [accessed 25 August 2006]. http://www.insightassessment.com/pdf_files/RUB_WTHS.PDF .
  • Bruffee K. A. Collaborative learning and the “conversation of mankind.” Coll. Engl. 1984; 46 (7):635–653. [ Google Scholar ]
  • Business-Higher Education Forum, and American Council on Education. Washington, DC: 2003. Building a Nation of Learners: The Need for Changes in Teaching and Learning To Meet Global Challenges. [ Google Scholar ]
  • Bybee R. W., Fuchs B. Preparing the 21st century workforce: a new reform in science and technology education. J. Res. Sci. Teach. 2006; 43 (4):349–352. [ Google Scholar ]
  • Campbell D. T., Stanley J. C. Boston, MA: Houghton Mifflin Company; 1963. Experimental and Quasi-experimental Designs for Research. [ Google Scholar ]
  • Carnevale A. P. American Society for Training Development. San Francisco, CA: Jossey-Bass; 1990. Workplace Basics: The Essential Skills Employers Want. [ Google Scholar ]
  • Center for Critical Thinking and Moral Critique. Rohnert Park, CA: Sonoma State University; 1996. ICAT Critical Thinking Essay Test. [ Google Scholar ]
  • Champagne A., Kouba V., Mintzes J., Wandersee J., Novak J. Assessing Science Understanding: A Human Constructivist View. New York: Academic Press; 1999. Written product as performance measures; pp. 224–248. [ Google Scholar ]
  • Collier K. G. Peer-group learning in higher education: the development of higher order skills. Stud. High. Educ. 1980; 5 (1):55–61. [ Google Scholar ]
  • Daempfle P. A. New York: U.S. Department of Education; 2002. Instructional Approaches for the Improvement of Reasoning in Introductory College Biology Courses: A Review of the Research. [ Google Scholar ]
  • Ebert-May D., Brewer C., Allred S. Innovation in large lectures—teaching for active learning. Bioscience. 1997; 47 (9):601–607. [ Google Scholar ]
  • Ennis R. H. A logical basis for measuring critical thinking skills. Educ. Leadership. 1985; 43 (2):44–48. [ Google Scholar ]
  • Ennis R. H., Weir E. Pacific Grove, CA: Midwest Publications; 1985. The Ennis-Weir Critical Thinking Essay Test. [ Google Scholar ]
  • Facione P. A. Millbrae, CA: Insight Assessment; 1990a. The California Critical Thinking Skills Test—College Level. Technical Report 1. Experimental Validation and Content Validity. [ Google Scholar ]
  • Facione P. A. Millbrae, CA: Insight Assessment; 1990b. The California Critical Thinking Skills Test—College Level. Technical Report 3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST. [ Google Scholar ]
  • Facione P. A. Millbrae, CA: Insight Assessment; 1990c. The California Critical Thinking Skills Test—College Level. Technical Report 4. Interpreting the CCTST, Group Norms, and Sub-Scores. [ Google Scholar ]
  • Facione P. A. Millbrae, CA: Insight Assessment; 1991. Using the California Critical Thinking Skills Test in Research, Evaluation, and Assessment. [ Google Scholar ]
  • Facione P. A. American Philosophical Association. Millbrae, CA: Insight Assessment; 1990. Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Research Findings and Recommendations. [ Google Scholar ]
  • Facione P. A., Facione N. C., Giancarlo C. A. Millbrae, CA: Insight Assessment; 1992. Test Manual: The California Critical Thinking Disposition Inventory. [ Google Scholar ]
  • Facione P. A, Facione N. C. Insight Assessment. 2004. [accessed 30 June 2006]. Test of everyday reasoning. http://www.insightassessment.com/test-ter.html . [ Google Scholar ]
  • Giancarlo C. A., Facione P. A. A look across four years at the disposition toward critical thinking among undergraduate students. J. Gen. Educ. 2001; 50 (1):29–55. [ Google Scholar ]
  • Grasha A. F. A matter of style: the teacher as expert, formal authority, personal model, facilitator, and delegator. Coll. Teach. 1994; 42 (4):142–149. [ Google Scholar ]
  • Hand B., Prain V. Teachers implementing writing-to-learn strategies in junior secondary science: a case study. Sci. Educ. 2002; 86 (6):737–755. [ Google Scholar ]
  • Hativa N., Barak R., Simhi E. Exemplary university teachers: knowledge and beliefs regarding effective teaching dimensions and strategies. J. High. Educ. 2001; 72 (6):699–729. [ Google Scholar ]
  • Holliday W. G. The reading-science learning-writing connection: breakthroughs, barriers, and promises. J. Res. Sci. Teach. 1994; 31 (9):877–893. [ Google Scholar ]
  • Holmes J., Clizbe E. Facing the 21st century. Bus. Educ. Forum. 1997; 52 (1):33–35. [ Google Scholar ]
  • Holyoak A. R. A plan for writing throughout (not just across) the biology curriculum. Am. Biol. Teach. 1998; 60 (3):186–190. [ Google Scholar ]
  • Howard V. A. Thinking on paper: a philosopher's look at writing. In: Howard V. A., editor. Varieties of Thinking: Essays from Harvard's Philosophy of Education Research Center. New York: Routledge; 1990. pp. 84–92. [ Google Scholar ]
  • Insight Assessment. Blohm S. Annual number of users for the CCTST form 2000. 2005 [accessed 8 December 2006]; [ Google Scholar ]
  • Jones E. A., Hoffman S., Moore L. M., Ratcliff G., Tibbets S., Click B., III . Report no. NCES-95-001. University Park, PA: U.S. Department of Education, Office of Educational Research and Improvement.; 1995. National Assessment of College Student Learning: Identifying College Graduates' Essential Skills in Writing, Speech and Listening, and Critical Thinking. Final project report. [ Google Scholar ]
  • Jones G. M, Carter G. Small groups and shared constructions. In: Mintzes J. J., Wandersee J. H., Novak J. D., editors. Teaching Science for Understanding: A Human Constructivist View. San Diego, CA: Academic Press; 1998. pp. 261–279. [ Google Scholar ]
  • Kelly G. J., Chen C. The sound of music: constructing science as sociocultural practices through oral and written discourse. J. Res. Sci. Teach. 1999; 36 (8):883–915. [ Google Scholar ]
  • Keys C. W. Revitalizing instruction in scientific genres: connecting knowledge production with writing to learn in science. Sci. Educ. 1999; 83 (2):115–130. [ Google Scholar ]
  • Keys C. W., Hand B., Prain V., Collins S. Using the science writing heuristic as a tool for learning from laboratory investigations in secondary science. J. Res. Sci. Teach. 1999; 36 (10):1065–1084. [ Google Scholar ]
  • Klein P. Reopening inquiry into cognitive processes in writing-to-learn. Ed. Psychol. Rev. 1999; 11 (3):203–270. [ Google Scholar ]
  • Klein P. D. Constructing scientific explanations through writing. Instr. Sci. 2004; 32 (3):191–231. [ Google Scholar ]
  • Klymkowsky M. W. Can nonmajors courses lead to biological literacy? Do majors courses do any better? Cell. Biol. Educ. 2006; 4 :42–44. [ PubMed ] [ Google Scholar ]
  • Kokkala I., Gessell D. A. Writing science effectively: biology and English students in an author-editor relationship. J. Coll. Sci. Teach. 2003; 32 (4):252–257. [ Google Scholar ]
  • Kurfiss J. G. Association for the Study of Higher Education. Washington, DC: George Washington University; 1988. Critical Thinking: Theory, Research, Practice, and Possibilities. [ Google Scholar ]
  • Langer J. A., Applebee A. N. Learning to write: learning to think. Educ. Horizons. 1985; 64 (1):36–38. [ Google Scholar ]
  • Langer J. A., Applebee A. N. Urbana, IL: National Council of Teachers of English; 1987. How Writing Shapes Thinking: A Study of Teaching and Learning. NCTE research report no. 22. [ Google Scholar ]
  • Lawson A. E. Using the learning cycle to teach biology concepts and reasoning patterns. J. Biol. Educ. 2001; 35 (4):165–169. [ Google Scholar ]
  • Malcom S. M., Abdallah J., Chubin D. E., Grogan K. A System of Solutions: Every School, Every Student. Washington, DC: American Association for the Advancement of Science; 2005. [ Google Scholar ]
  • Marzano R. J. Fostering thinking across the curriculum through knowledge restructuring. J. Reading. 1991; 34 (7):518–525. [ Google Scholar ]
  • National Academy of Sciences, National Academy of Engineering, Institute of Medicine. Washington, DC: Committee on Prospering in the Global Economy of the 21st Century; 2005. Rising Above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future. [ Google Scholar ]
  • National Research Council. Washington, DC: National Academy Press; 1995. National Science Education Standards. [ Google Scholar ]
  • National Research Council. Washington, DC: Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century; 2003. Bio 2010, Transforming Undergraduate Education for Future Research Biologists. [ Google Scholar ]
  • National Science Foundation. Washington, DC: Directorate for Education and Human Resources; 1996. Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering, and Technology. [ Google Scholar ]
  • Office of Educational Research and Improvement. Washington, DC: 1991. Striving for Excellence: The National Education Goals. [ Google Scholar ]
  • Project Kaleidoscope. Washington, DC: National Science Foundation; 2006. Transforming America's Scientific and Technological Infrastructure: Recommendations for Urgent Action. [ Google Scholar ]
  • Resnick L. B. Education and Learning To Think. Washington DC: National Academy Press; 1987. [ Google Scholar ]
  • Rivard L. P. A review of writing to learn in science: implications for practice and research. J. Res. Sci. Teach. 1994; 31 (9):969–983. [ Google Scholar ]
  • Springer L., Donovan S. S., Stanne M. E. Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: a meta-analysis. Rev. Educ. Res. 1999; 69 (1):21–51. [ Google Scholar ]
  • Steglich C. S. A writing assignment that changes attitudes in biology classes. Am. Biol. Teach. 2000; 62 (2):98–101. [ Google Scholar ]
  • Taylor K. L., Sobota S. J. Writing in biology: an integration of disciplines. Am. Biol. Teach. 1998; 60 (5):350–353. [ Google Scholar ]
  • Tessier J. Writing assignment in a nonmajor introductory ecology class. J. Coll. Sci. Teach. 2006; 35 (4):25–29. [ Google Scholar ]
  • Tobin K. G., Tippins D. J., Gallard A. J. Research on instructional strategies for teaching science. In: Gabel D. L., editor. Handbook of Research on Science Teaching and Learning. New York: Macmillan; 1994. pp. 45–93. [ Google Scholar ]
  • Tsui L. ASHE annual meeting paper. Miami, FL: 1998. A review of research on critical thinking; pp. 5–8. 1998 November. [ Google Scholar ]
  • Tsui L. Fostering critical thinking through effective pedagogy: evidence from four institutional case studies. J. High. Educ. 2002; 73 (6):740–763. [ Google Scholar ]
  • U.S. Department of Education. Washington, DC: 1990. National Goals for Education. [ Google Scholar ]
  • Watson G., Glaser E. M. Watson-Glaser Critical Thinking Appraisal. Cleveland, OH: The Psychological Corporation (Harcourt Brace Jovanovich); 1980. [ Google Scholar ]

Craig B. Barkacs MBA, JD

Critical Thinking Is All About “Connecting the Dots”

Why memory is the missing piece in teaching critical thinking.

Updated July 17, 2024 | Reviewed by Monica Vilhauer

  • Critical thinking requires us to simultaneously analyze and interpret different pieces of information.
  • To effectively interpret information, one must first be able to remember it.
  • With technology reducing our memory skills, we must work on strengthening them.

I have a couple of questions for my regular (or semi-regular) readers, touching on a topic I’ve discussed many times on this blog. When it comes to power, persuasion, and influence, why is critical thinking so crucial? Alternatively, what are some common traps and pitfalls for those who prioritize critical thinking? It's not necessary that you go into great detail—just any vague or general information that comes to mind will do.

Great! Regardless of whether you recalled anything specific, the key is you made the effort to remember something. Like many questions I pose here, the real purpose is to illustrate a point. If you aim to be influential and persuasive—i.e., successful—in both work and life, you must be proficient in critical thinking. To achieve this proficiency, you need to cultivate and exercise your memory, a skill that is increasingly at risk in a technology-saturated age.

Remembering Is the Foundation of Knowing

Learning and remembering something are often discussed as if they are two separate processes, but they are inextricably linked. Consider this: Everything you know now is something you once had to learn, from basic facts to complex knowledge and skills. Retaining this information as actual knowledge, rather than fleeting stimuli, depends entirely on memory. Without memory, there is no knowledge. Consequently, there can be no critical thinking, as it relies on prior knowledge, which in turn relies on memory.

Students sometimes tell me that they want to learn how to be good critical thinkers but complain about having to “memorize stuff.” On these occasions I will often say, in a playfully teasing manner, “What I hear you saying is that you're bothered by having to remember stuff.” This usually helps them see how silly and unreasonable it is to complain about memorizing information, as there isn’t a single course in existence that doesn’t require remembering something. The ability to remember is at the core of critical thinking, and I often use the simple visual demonstration that follows to illustrate this point.

Collecting Dots and Connecting the Dots

Benjamin Bloom, an educational psychologist, developed a model known as the “Taxonomy of Learning.” Originally intended for educational psychology, this model also highlights why memory is the foundation of critical thinking—or any kind of thinking at all.

Humans are creatures of interpretation, constantly processing the information we perceive. This ability has made us the scientists, inventors, and artists that we are today. To interpret information, however, we must first remember it—not all information, obviously, as that’s impossible. Thanks to technology (which we’ll get to momentarily) we have vast amounts of information potentially at our fingertips. But how do you know what information to look up in a given situation? To know where to start and avoid endlessly searching irrelevant data, you need to remember enough of the right kind of information.

Think of a crime movie where an investigator, while reviewing evidence, suddenly has an epiphany and rushes off to confirm their hunch. These scenes illustrate that while the investigator needs more information, they remember enough to know what to search for.

Here’s a visual demonstration I use in class to help my students understand. Imagine you have pieces of information represented as five dots:

[Image: five unconnected dots. Source: Craig Barkacs]

Now let’s say that any coherent shape or picture you can draw using these dots is an interpretation of the information. When examined together, what might these five dots mean? Here’s one way to connect the dots.

[Image: the five dots connected into a house-like shape. Source: Craig Barkacs]

What does this shape represent? Many people will quickly say it’s a house, a common and reasonable interpretation. But not everyone sees it as a house. Some might say it’s the home plate used in baseball. Even when people connect the dots (i.e., interpret a cluster of information) the same way using the same lines, they don’t necessarily interpret the picture the same way. The situation becomes more complex when people connect the dots differently, creating a completely different shape or picture.

[Image: the five dots connected into a five-pointed star. Source: Craig Barkacs]

Now, having connected the dots differently, instead of a house, we have a star. Or at least some would consider it a star; others might say it’s an occult or magic symbol—these are all very different interpretations. This shows that with the same pieces of information, people can “connect the dots” differently, and even when they connect them the same way, they see different things.

Now what happens when additional information is added or an alleged “missing dot” is perceived by others?

[Image: the dots with one additional dot added. Source: Craig Barkacs]

With just one additional dot, what could have previously been interpreted as a 5-pointed star can now be reasonably interpreted as the Star of David.

Finally, sometimes the additional information can lead to a completely different shape or image, resulting in a “eureka” moment of insight. What previously appeared as different types of stars now looks like a circle.

[Image: the dots connected into a circle. Source: Craig Barkacs]

I use this classroom demonstration to illustrate how people can interpret the same objective information in highly subjective ways, creating different narratives for themselves and others. This is a crucial point to remember when aiming to influence or persuade others—i.e., the need to see things from their perspective. Additionally, this activity powerfully underscores the importance of “collecting dots”—that is, the importance of remembering crucial bits of information. Without enough such dots, you lack the basic information needed to form meaningful ideas. Without meaningful ideas, you can’t think critically, influence, or persuade. It’s as straightforward as that.

Memory in the Age of Omnipresent Technology

Why is it so crucial to recognize that memory is foundational to critical thinking, power, influence, and persuasion? Partly because this fact isn’t widely acknowledged—and it needs to be. Additionally, we live in an era where memory is under unprecedented assault. While technology allows us to achieve remarkable feats unimaginable to previous generations, it comes at a cost. One such cost is “digital-induced amnesia,” where our memory capabilities atrophy due to information overload and technology taking over many of the cognitive tasks we used to perform ourselves.

Memory doesn’t exist in isolation. It’s closely tied to traits like the ability to focus and pay attention. If you’re not paying attention, you can’t absorb the information that you want or need to remember. Unfortunately, technology also impacts our ability to focus, and this doesn’t even touch on the dramatic ways AI’s explosive development might undermine our thinking skills.

This article won’t delve into specifics on improving focus and memory in an age of tech ubiquity. Fortunately, resources from Psychology Today can help with that. My goal here is to convince you why memory is so vital for anyone who wishes to be a critical thinker and a persuasive, influential person. Now you know. Whether you’ll remember or not...only time will tell.


Craig Barkacs, MBA, JD, is a professor of business law at the University of San Diego School of Business and a trial lawyer with three decades of experience as an attorney in high-profile cases.


ORIGINAL RESEARCH article

Analysis of the contribution of critical thinking and psychological well-being to academic performance.

Miguel H. Guamanga

  • 1 Faculty of Human Sciences, Universidad Icesi, Cali, Colombia
  • 2 Department of Basic Psychology, Psychobiology and Methodology of Behavioral Sciences, University of Salamanca, Salamanca, Spain
  • 3 Psychology Research Centre (CIPsi/UM), Department of Basic Psychology, School of Psychology, University of Minho, Braga, Portugal

This study examines the influence of critical thinking and psychological well-being on the academic performance of first-year college students. It emphasizes the importance of a model of psychological well-being focused on self-acceptance, environmental mastery, and purpose in life, along with a critical thinking approach oriented to problem solving and decision making. A total of 128 first-year psychology students from a Spanish public university participated and were assessed by means of Ryff’s Psychological Well-Being Scale (PWBS) and the PENCRISAL critical thinking test, complemented with grades obtained in a critical thinking course. The results show positive correlations among psychological well-being, critical thinking, and academic performance, with the strongest relationship between critical thinking and academic performance. However, psychological well-being also plays a significant role in academic performance. The findings highlight the need for holistic pedagogical approaches that combine cognitive skills and personal development to enhance first-year students’ learning.
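To make the kind of analysis described above concrete, here is a minimal sketch of pairwise Pearson correlations computed on simulated data for 128 students. The effect sizes and relationships below are assumptions chosen to mimic the reported pattern (critical thinking relating to performance more strongly than well-being), not the study's actual data or results.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 128  # same sample size as the study; the data are simulated, not the authors'

# Assumed structure: critical thinking (CT) drives performance strongly,
# psychological well-being (PWB) more weakly, and CT and PWB covary a little.
ct = rng.normal(0, 1, n)
pwb = 0.3 * ct + rng.normal(0, 1, n)
performance = 0.6 * ct + 0.25 * pwb + rng.normal(0, 1, n)

# Pairwise Pearson correlations, the statistic reported in studies of this kind.
r = np.corrcoef([ct, pwb, performance])
print("r(CT, performance)  =", round(r[0, 2], 2))
print("r(PWB, performance) =", round(r[1, 2], 2))
```

With this assumed structure, both correlations come out positive and the CT–performance correlation is larger, matching the qualitative pattern the abstract describes.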

1 Introduction

In the context of the increasing demands of contemporary societies, in this study we address how critical thinking (CT) and psychological well-being (PWB) influence academic performance within the university setting. Upon entering university, first-year students are faced with the challenge of adapting to new academic dynamics and demands, which they must balance with the pursuit of personal satisfaction (Acee et al., 2012; Casanova et al., 2018). The adaptation process, which involves the achievement of academic goals and the projection of long-term life objectives, is fundamental to academic performance, considered a key indicator of successful adaptation and a reflection of the competencies required in the professional environment (Alonso-Borrego and Romero-Medina, 2016; Frick and Maihaus, 2016).

The goal of this research is to show the link between CT, which is characterized by analyzing and evaluating information, making evidence-based inferences, and reflecting on one’s own thought process for decision making and problem solving (Bailin et al., 1999; Ennis, 2015; Jahn and Kenner, 2018; Saiz, 2020; Halpern and Dunn, 2023), and PWB, which focuses on personal development (Ryff, 1989, 2013; Ryff and Keyes, 1995), and to analyze how both contribute to academic performance. Despite the complexity of the factors that can influence academic performance, in this study we combine cognitive and socio-affective variables to better understand these dynamics. Based on the Ryff Psychological Well-Being Scale (PWBS), we examine how well-being, especially through self-acceptance, environmental mastery, and purpose in life, impacts academic performance. As a starting point, we recognize that CT may have an even greater effect on academic performance. This holistic approach seeks to contribute to the debate on the competencies needed for the 21st century through the relevance of CT and PWB in university education and their role in the formation of individuals capable of coping with contemporary demands.

1.1 Contextualization and characterization of academic performance

In the university context, academic performance is influenced by a range of factors, from pedagogical practices and student satisfaction with them to more personal and intrinsic elements, including the student’s motivation and emotional state, academic background, IQ, personality traits, and level of psychological maturity. This multi-layered view highlights the complexity underlying academic performance and emphasizes the interaction between the educational environment and the individual qualities of each student.

A study by Oliván Blázquez et al. (2019) highlights the flipped classroom (FC) method in comparison to traditional lecture-based learning (LB) and shows that FC not only improves students’ grades, but also maintains their satisfaction with learning without increasing their perceived workload. Although FC was initially perceived as more difficult, this did not have a negative impact on satisfaction or long-term learning, underscoring the importance of student perceptions and involvement in the learning process. These results support the introduction of FC in higher education and point to the need for continuous adjustments based on student feedback to maximize academic performance and develop critical and practical skills.

Beyond educational practices, Gilar-Corbi et al. (2020) investigated how motivational and emotional factors and prior academic performance influence college students’ success. The study used the Motivated Strategies for Learning Questionnaire (MSLQ) and the Trait Meta-Mood Scale (TMMS) to measure motivational learning strategies and emotional intelligence. The findings show that scores obtained on the diagnostic tests have a strong influence on academic performance, while emotional attention has a minor influence. The study points out that prior performance, together with self-efficacy and appropriate emotional regulation, plays a crucial role in predicting academic success. Thus, the authors suggest that interventions focused on improving self-efficacy and emotional intelligence may be key to optimizing students’ academic outcomes.

In the same context, this time with more variables, Morales-Vives et al. (2020) investigate the influence of intelligence, psychological maturity, and personality traits on the academic performance of adolescents, and find that these factors combined explain about 30% of its variability. Intelligence, especially in reasoning and numerical aptitude, emerges as the most significant predictor, while psychological maturity, reflected in work orientation, and traits such as conscientiousness and openness to experience have an indirect influence. These findings show that, although intelligence plays a decisive role, maturity and personality also contribute, though to a lesser extent.

These conclusions and the recommendations derived from them align with recent advances in academic research. One example is the work of Mammadov (2022), which draws attention to cognitive ability as the main predictor of academic performance but also points to the relevance of conscientiousness, a personality trait associated with self-discipline and organization, which explains a significant part of the variability in academic performance. Mammadov also suggests that the influence of personality on performance varies by educational level, showing the dynamics between a student’s personality and his or her educational context. These findings demonstrate the need for educational strategies that promote both cognitive development and the reinforcement of positive personality traits.

Recent research on academic performance shows two consensuses. First, there is a growing understanding of the influence of the interaction between intrinsic and extrinsic factors, including pedagogical methods and motivational, emotional, and cognitive elements, in improving the performance and satisfaction of students in higher education. The studies reviewed highlight the relevance of cognitive ability and personality traits such as conscientiousness, and promote a holistic educational approach that integrates the development of cognitive and personality dimensions. Second, academic achievement is recognized as a multidimensional construct, objectively assessed through quantitative indicators such as grade point average (GPA) and standardized assessment scores. These reflect the attainment of educational objectives and the accumulation of knowledge and skills over time.

1.2 Contextualization and characterization of critical thinking

Halpern (1998) argues that intrinsic effort and a willingness to analyze and solve complex problems are key competencies for learning and adapting to a constantly changing environment. According to Halpern (1998), CT transcends the mere acquisition of analytical skills and requires the development of an active predisposition to question assumptions, consider diverse perspectives, and persist in cognitive effort. This disposition is by no means innate, but it can be cultivated through a pedagogy that explicitly integrates the teaching of critical skills such as logical analysis, argument evaluation, and information synthesis, and that emphasizes problem structuring to facilitate skill transfer and metacognitive self-regulation. Halpern proposes an educational framework that promotes the acquisition of these skills and encourages reflection on the thinking process so that students are able to apply CT effectively in diverse contexts and improve continuously. This methodical and structured approach characterizes CT as a set of advanced cognitive skills and an exercise of conscious judgment that is essential for informed, evidence-based decision making and that integrates non-cognitive elements (Halpern and Dunn, 2023).

Throughout the development of the discourse on CT, various theories and their empirical foundations have evolved into meaningful educational practices, recognized in diverse academic settings. Meta-analyses, particularly those by Abrami et al. (2008 , 2015) have contributed significantly to the understanding of effective teaching of CT and have emphasized the need for specific and tailored teaching strategies that incorporate clear CT objectives into educational programs. These studies demonstrate that CT, defined as a process of intentional, self-regulated judgment that includes interpretation, analysis, evaluation, and inference, is increasingly recognized as essential in the knowledge era. Abrami et al. (2008) note that critical skills and dispositions are developed through explicit pedagogical interventions, as opposed to spontaneous acquisition, which challenges traditional pedagogical paradigms and fosters a shift towards intentional educational practices, placing students at the center of learning.

In addition, a more detailed analysis by Abrami et al. (2015) identifies that strategies that encourage interactive dialogue, confrontation with real problems, and individual tutorials are particularly effective. This suggests that active and meaningful learning outperforms traditional methods in the development of critical skills. This approach not only enhances students’ analytical and synthesis skills, but also facilitates the transfer of knowledge to new contexts, a key skill for the 21st century. The research reinforces the view that CT is a cross-cutting competency, crucial for navigating the complexity of contemporary challenges, and argues for an education that integrates these skills into all areas of learning.

Despite in-depth analyses of the need for CT, the growing discrepancy between rapid progress, the availability of information and the ability to critically analyze it poses a major challenge. Dwyer et al. (2014) point out that the exponential increase in global information has outpaced the ability of traditional education systems to teach effective CT skills, creating a gap that may inadequately prepare students for the challenges of today’s world. The authors argue that the ability to critically evaluate, synthesize, and apply knowledge is crucial for academic success and survival in the 21st century. This approach highlights how CT, by fostering analytical and reflective skills, transcends academia to positively impact individual and collective well-being, and argues for educational strategies that bridge the gap between information acquisition and critical analytical skills.

Recent research on this topic points to the indisputable relevance of CT as an essential component of academic performance and points to its role as a key predictor of success in educational processes. Rivas et al. (2023) show that CT transcends conventional cognitive skills. This is because CT is characterized as a rigorous practice that fosters in-depth analysis, critical evaluation and synthesis of information oriented to decision making and problem solving, fundamental skills to understand and apply knowledge in complex contexts. Research shows that CT skills not only maintain a positive correlation with academic performance, but can be significantly improved through targeted educational programs. For this reason, the authors advocate their integration into curricula and educational assessment systems to prepare students for the challenges of the 21st century, especially when phenomena such as artificial intelligence acquire greater prominence in social and professional dynamics ( Saiz and Rivas, 2023 ).

The literature on CT identifies two fundamental consensuses: first, it defines it as an intentional and deep process, oriented to problem solving and decision making, based on meticulous analysis that goes beyond logical reasoning to include a critical evaluation of the basis for judgments. In addition, it involves detailed scrutiny and integration of new information in changing contexts, as well as metacognition, i.e., conscious self-regulation of thinking that facilitates adaptation and continuous improvement of cognitive strategies in accordance with the major demands and obstacles of the first half of our century ( Dwyer, 2023 ). In its practical application, CT enables daily challenges to be met through informed judgments and a willingness to question and adjust perspectives in response to new information. Characterized by curiosity and adaptability, CT is essential for making responsible decisions and achieving successful outcomes, underscoring its practical value in both personal and professional settings.

Second, CT, beyond its theoretical value, can be conceived as a key theory of action for academic performance and PWB ( Saiz, 2020 ; Saiz and Rivas, 2023 ), by enhancing in individuals the ability to face and solve problems in an effective and grounded manner. CT involves crucial skills such as analysis, evaluation and synthesis, indispensable for acquiring and retaining knowledge, and also for applying it in new contexts, which improves academic performance and has, in principle, positive effects on quality of life. Thus, CT emerges as an academic competence and an essential tool for everyday life ( Dumitru and Halpern, 2023 ; Guamanga et al., 2023 ). Therefore, to synthesize theoretical paths with a practical function, we understand that “to think critically is to arrive at the best explanation of a fact, phenomenon or problem in order to know how to solve it effectively” ( Saiz, 2024 , p. 19).

1.3 Contextualization and characterization of psychological well-being

The task of relating concepts that are difficult to operationalize, such as well-being, is a major challenge; but it is necessary to approach it, more within a framework of CT understood as a means to achieve broad objectives than as an end in itself. Thinking critically transcends the mere application of skills or the accumulation of goal-oriented knowledge. In fact, it requires a detailed examination of the effect that such management has on the environment and how the satisfaction derived from reaching certain achievements is related to subjective aspects.

CT by its very deliberative and goal-oriented nature goes beyond the search for how to reach effective solutions and addresses a wider range of human and social consequences resulting from these actions ( Facione, 1990 ; Elder, 1997 ; Jahn, 2019 ). The idea is to involve non-cognitive aspects that occupy a central place in academia, and that are crucial in the interaction between specific knowledge and skills, elements widely explored in the discourse of CT. In this sense, PWB has been selected as the focus of study, recognizing it as a desirable attribute in educational processes. The challenges this poses are not lost sight of, especially when it comes to quantifying transient, subjective and normatively mediated judgments about what states or conditions are considered good, healthy or desirable in the complexity of human experience, as detailed by Flanagan et al. (2023) .

The contribution of Ryff (1989 , 2013) and Ryff and Keyes (1995) to the conceptual understanding and dissemination of PWB is notable and highly valued in different fields of knowledge ( Van Dierendonck and Lam, 2023 ). The imprint of her research has been marked by criticism of a reductionist conception of PWB that simplifies well-being to the presence of positive affective states ( Ryff, 1989 ). Consequently, Ryff defends a much more complex multidimensional concept that seeks to attune the attainment of goals with the development of potentialities. Ryff’s thesis is that PWB is a multidimensional construct that transcends happiness or mere life satisfaction ( Ryff and Keyes, 1995 ).

Carol Ryff’s theory of PWB, based on humanistic, clinical and developmental psychology, as well as Aristotelian eudaimonia, focuses on self-actualization, the search for meaning and purpose in life as the core of well-being. As detailed in the text Happiness is everything, or is it? Explorations on the meaning of psychological well-being ( Ryff, 1989 ) the model consists of six dimensions that converge in personal development: autonomy, environmental mastery, personal growth, positive relations with others, purpose in life, and self-acceptance.

The first dimension, self-acceptance, implies a positive attitude towards oneself and an acceptance of all aspects of one’s identity, including both positive and negative qualities. As for positive relationships with others, Ryff states that these are interpersonal relationships characterized by warmth, trust and genuine concern for the well-being of others; this dimension is dominated by the value of empathy in human well-being. Autonomy is defined by an individual’s capacity to maintain independence and resist social pressures in order to regulate their behavior according to internal personal norms. This dimension emphasizes self-determination as a compass for the pursuit of well-being. On the other hand, environmental mastery emphasizes the ability to effectively manage and control the external environment, which implies a feeling of competence and control over personal and professional life. Finally, purpose in life and personal growth refer to the possession of goals, direction and a sense of development and fulfillment of one’s potential. These dimensions reflect the search for meaning and continuous personal evolution as fundamental components of PWB.

Ryff’s PWBS has established itself as a key instrument in positive psychology. Research after 1989 ( Ryff and Keyes, 1995 ; Ryff, 2013 ) has explored the variability of these dimensions with age and across genders. These studies showed the influence of sociodemographic factors on well-being, so the model has been extended to consider the development of PWB across the lifespan and determined by more contextual factors such as health. The approach enriches the understanding of PWB and denotes the practical relevance of the construct in fields such as mental health and social policy. Ryff’s work has inspired other researchers to discuss and extend its principles ( Van Dierendonck and Lam, 2023 ). For example, Huppert (2009) complements Ryff’s dimensions by emphasizing the management of negative emotions and resilience as key components of sustainable well-being; Huppert aligns this view with the World Health Organization (WHO) definition of health and adds a dynamic dimension on overcoming adversity. This theoretical and practical deepening demonstrates the robustness and adaptability of Ryff’s model. The synthesis of these contributions confirms the value and applicability of Ryff’s PWBS; they reveal how the eudaemonic model not only reinforces an academic discourse, but also guides practices that promote well-being in different contexts and consolidates itself as a vital field in human development.

However, owing to the very complexity and breadth of the PWB construct, Ryff’s PWBS has attracted various observations that question its theoretical and statistical foundations. On the first aspect, the work of Disabato et al. (2016) , by examining the distinction between hedonic and eudaimonic well-being, problematizes the theoretical basis of this dichotomy. Through an analysis incorporating data from 7,617 individuals from 109 countries, the authors find that there is no clear distinction between hedonic well-being experiences, focused on pleasure, and eudaimonic ones, related to personal fulfillment. The results indicate a high correlation between the two types of well-being ( r  = 0.96), suggesting that people do not significantly differentiate between pleasure seeking and self-fulfillment in their perception of well-being. This implies that the hedonic-eudaimonic dichotomy may not hold empirically and that, therefore, a unified model of well-being reflecting current behavioral dynamics should be sought.

From a statistical perspective, Ryff and Keyes (1995) analyses show that the PWBS, composed of 18 items, meets psychometric criteria, with strong internal consistency and moderate correlations among its scales. Correlations between dimensions range from low to modest (0.13 to 0.46), suggesting that, although interrelated, the dimensions each capture unique aspects of psychological well-being. In terms of specific results, studies indicate that with age the dimensions of environmental mastery and autonomy increase, while purpose in life and personal growth tend to decrease, with no significant changes in self-acceptance and positive relationships with others. Women outperform men on positive relationships with others and personal growth, suggesting that changes in these dimensions reflect evolving priorities and perceptions of personal development across the life span ( Ryff and Keyes, 1995 ).

On the number of dimensions of the PWBS, Blasco-Belled and Alsinet (2022) note that the six-dimensional theoretical model has generated debate even among experts in the field. Some suggest that a four-dimensional model (environmental mastery, personal growth, purpose in life, and self-acceptance) might represent a second-order PWB factor, indicating a possible conceptual overlap between Ryff’s original dimensions; others exclude positive relationships with others and autonomy from the model. The network analysis of Ryff’s PWBS conducted by Blasco-Belled and Alsinet (2022) shows four distinct dimensions; in the most important node of the network, self-acceptance, purpose in life and environmental mastery are grouped together, with special emphasis on self-acceptance because of its centrality in the network at the item level.

In the Spanish-speaking context, Nogueira et al. (2023) identified three main factors: autonomy, positive relationships with others, and competence. This suggests that the PWBS may vary according to cultural and contextual factors. Furthermore, although it is not a study analyzing the dimensions of Ryff’s PWBS, the study by Páez-Gallego et al. (2020) applied the PWBS to Spanish adolescent students and found a strong positive correlation with the use of adaptive decision-making strategies. Specifically, the findings show that the adaptive approach is significantly associated with improvements in self-acceptance, environmental mastery, and purpose in life. In contrast, maladaptive strategies characterized by impulsivity and avoidance are associated with lower PWB. From this we infer that fostering effective decision-making skills is important for well-being and, in particular, we identify from empirical studies the dimensions of the PWBS that correlate with decision-making skills.

Taken together, these findings suggest that Ryff’s PWBS, although pioneering and widely used, could benefit from revision to more accurately reflect the structure of PWB and its application in diverse cultural and educational contexts. The convergence of evidence from factorial and network analysis perspectives points to the need for a more integrated and adaptive model capable of capturing the complexity and dynamics of the underlying constructs. This underscores the continuing interest in PWB in research and practice. It is also an indication of the ongoing scholarly debate about its conceptualization and measurement. The recurrence of dimensions such as self-acceptance, environmental mastery, and purpose in life across analyses suggests a common core of PWB. This raises the question of whether these dimensions can be conceptually aligned with academic achievement and CT. In addition, questioning the boundaries between hedonic and eudaimonic raises the issue of whether a broader construct is needed to analyze well-being in educational settings. In this context, we start from the premise that self-acceptance, environmental mastery, and purpose in life are sufficient to explore college students’ PWB. These dimensions reflect students’ ability to recognize their strengths and weaknesses, set goals, and navigate effectively in their educational environment, aspects that could be considered part of the dispositional component necessary for the development of higher-level competencies such as those of the CT.

The research brings to empirical analysis the complex interplay between CT, PWB, and academic performance in the university context. We seek to answer how CT skills and PWB influence college students’ academic performance; and, how CT practices can be aligned with PWB to improve academic performance. We propose that the study variables converge in both a theoretical and an empirical model. The argumentative strategy consists of analyzing the direct impact of CT on academic performance, assessing whether PWB correlates with better academic outcomes, examining in detail the predictive factor of the relationship between CT and PWB on academic performance, and finally, according to the data obtained, proposing some dialogic bridges between cognitive and non-cognitive aspects of CT.

2 Methodology

2.1 Participants

The study involved 128 first-year psychology students from a Spanish public university. The vast majority were women (83.1%), with only 16.9% men, as is usual in social sciences and humanities degrees. Age ranged from 18 to 33 years, with a mean of 19.28 (SD = 1.73). The sample was essentially composed of students who had recently completed secondary education (75.3% of the students were 19 years old). There were no statistically significant differences in age between females (M = 19.09, SD = 0.814) and males (M = 20.20, SD = 3.78), although the males were on average older and their ages more dispersed.

2.2 Instruments

The instruments applied were Ryff’s PWBS in its Spanish adaptation ( Díaz et al., 2006 ) and the PENCRISAL critical thinking test ( Saiz and Rivas, 2008 ; Rivas and Saiz, 2012 ). For academic performance, the academic records of the students taking the critical thinking course in the first year of the psychology degree were collected. Grades range on an ascending scale from 1 to 10.

As noted in the previous discussion, Ryff’s PWBS has been specified in different models. The instrument aims to measure psychological well-being, focusing on students’ own evaluations of their situations and perceived success in various aspects of life and personal development. It explores well-being through six main dimensions: self-acceptance (α = 0.83), positive relations with others (α = 0.81), environmental mastery (α = 0.71), autonomy (α = 0.73), purpose in life (α = 0.83) and personal growth (α = 0.68). The questionnaire consists of 39 items, presented in a Likert scale format ranging from 1 (strongly disagree) to 6 (strongly agree) ( Díaz et al., 2006 ).
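Reliability coefficients like those reported for each dimension can be computed with the standard Cronbach's alpha formula. The sketch below uses synthetic Likert responses and an illustrative 6-item grouping (the real scale has 39 items across six dimensions), so it shows only the calculation, not the study data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the sum score
    return k / (k - 1) * (1 - item_vars / total_var)

# Synthetic 1-6 Likert responses for one hypothetical 6-item dimension:
# items sharing a common trait component yield a high alpha.
rng = np.random.default_rng(0)
trait = rng.normal(0, 1, size=(200, 1))
responses = np.clip(np.round(3.5 + trait + rng.normal(0, 0.7, size=(200, 6))), 1, 6)
alpha = cronbach_alpha(responses)
```

Because the simulated items share most of their variance, the resulting alpha is high; uncorrelated noise items would drive it toward zero.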

Consistent with the complexity of the scale and with findings shared by other studies, we have chosen to consider only self-acceptance, environmental mastery and purpose in life. In support of this methodological decision, we performed an exploratory factor analysis (principal components method) on our sample to verify whether these three dimensions converge on a single factor. The data confirm this convergence: the single factor has an eigenvalue of 2.43 and explains a high proportion of the variance (81.1%).

In the case of the PENCRISAL, the full version was applied, and the score was taken for each of the five dimensions and the total score. The PENCRISAL was applied to measure CT skills. This test consists of 35 problem situations that participants answer in an open-response format. The test is organized into five key areas: deductive reasoning, inductive reasoning, practical reasoning, decision making and problem solving.

The deductive and inductive components test different forms of reasoning, such as propositional, categorical, causal, analogical and hypothetical. Decision-making measures the ability to make probabilistic judgments and to effectively use heuristics to identify potential biases. The problem-solving section presents participants with general and specific problems that require appropriate solution strategies. These sections are intended to encourage the application of strategies necessary for effective problem planning. The open-ended question format encourages participants to justify their answers, which are evaluated using a scoring system that rates the quality of each response on a scale of 0 to 2. Responses are converted into numerical scores using item-specific criteria, which are used to describe and identify the thinking mechanisms underlying each response. A score of 0 indicates that the answer is incorrect, 1 indicates that the answer is correct but the justification is missing or inadequate, and 2 indicates that the answer is correct and adequately justified. The PENCRISAL yields an overall CT score ranging from 0 to 70 and a score from 0 to 14 for each dimension. Reliability assessments show satisfactory accuracy, with a minimum Cronbach’s alpha of 0.632 and a test–retest reliability of 0.786 ( Rivas and Saiz, 2012 ). The test is administered online through the SelectSurvey.NET V5 platform.
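These scoring rules can be summarized in a short sketch. The item-to-dimension grouping used below (seven consecutive items per dimension) is illustrative, not the test's actual scoring key:

```python
# Hypothetical sketch of PENCRISAL score aggregation: 35 items rated
# 0 (incorrect), 1 (correct, justification missing or inadequate) or
# 2 (correct and justified), grouped here as 7 items per dimension.
DIMENSIONS = ["deductive", "inductive", "practical", "decision", "problem"]

def aggregate(item_scores):
    """item_scores: sequence of 35 ints in {0, 1, 2}."""
    assert len(item_scores) == 35 and all(s in (0, 1, 2) for s in item_scores)
    per_dim = {d: sum(item_scores[7 * i:7 * (i + 1)])  # each dimension: 0-14
               for i, d in enumerate(DIMENSIONS)}
    return per_dim, sum(item_scores)                   # total score: 0-70

per_dim, total = aggregate([2] * 35)  # a perfect protocol scores 14 per dimension, 70 total
```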

2.3 Procedures

Students gave their free and informed consent to participate in the study. The PWBS was carried out at the beginning of the semester of the CT course. The PENCRISAL test was taken at the beginning and at the end of the academic period. Only the results of students who completed both instruments are considered. Academic performance is represented by the grade obtained by students at the end of the course. Statistical analyses were performed with IBM/SPSS version 29.0. After performing the descriptive statistics, we proceeded to a correlation analysis and, finally, we evaluated the impact of the PWBS and the CT on the variance of academic performance by performing a regression analysis.

3 Results

Table 1 presents the descriptive data of the students’ scores on the two instruments applied and the measure of academic performance. In addition to the minimum and maximum values, the mean, standard deviation and indicators of skewness and kurtosis of the distribution of the results are presented.


Table 1 . Descriptive statistics for the measures used ( n  = 128).

Observing the results, we can see a distribution with a slight tendency towards values above the mean (M = 79.80) for the PWBS, which is reflected in a negative skewness (−0.437). With respect to the five dimensions of CT, it can be stated globally that the mean values of DR, IR and PS lie closer to the minimum observed value than to the maximum, which represents a positive skewness. The opposite situation occurs with the PR dimension. Regarding the TCT, the data show a tendency towards scores around the mean (M = 37.21), as can be deduced from the near-zero values of skewness and kurtosis. Regarding AP, the data suggest a balanced distribution of academic scores around an intermediate value (M = 6.10) between the lower and upper extremes of 3.66 and 9.01, with very low skewness and kurtosis.

In general, the results show good variability or dispersion, since the mean of each variable is located in the center of the data interval, which is desirable in research to adequately represent the population studied. Skewness and kurtosis indices close to zero for academic achievement are especially indicative of a normal or Gaussian distribution of values. The slightly higher kurtosis in the IR dimension of CT (2.248) is still acceptable.

Table 2 shows the correlations between the variables in this study. Since these were interval-level metric variables, Pearson’s product-moment method was used to calculate the correlations. For statistical significance, a two-tailed test was used and p  < 0.05 was set as the threshold of significance.
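A minimal sketch of this correlation step, assuming SciPy is available and using synthetic scores in place of the study data:

```python
import numpy as np
from scipy import stats

# Pearson product-moment r with a two-tailed p-value, as used for Table 2.
# The scores below are synthetic; variable names are illustrative.
rng = np.random.default_rng(1)
tct = rng.normal(37, 8, 128)             # total critical-thinking score
ap = 0.05 * tct + rng.normal(4, 1, 128)  # academic performance, partly driven by TCT
r, p = stats.pearsonr(tct, ap)           # two-tailed p-value by default
significant = p < 0.05                   # decision at the 5% level
```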


Table 2 . Correlations between study variables.

According to the data, the highest correlation is found between TCT and AP, with the lowest between the PWBS measure and TCT (effectively no correlation). At an intermediate level is the correlation between PWBS and AP. Likewise, all the dimensions of CT correlate with AP, with values between 0.183 (PS) and 0.337 (PR). As can be seen, there are variations in the correlations among the five dimensions of CT, but all have high correlations with the total score (between 0.502 and 0.668). For this reason, only the TCT score is used in the regression analysis.

In summary, the data suggest that there is a significant and positive relationship between PWBS and AP, as well as an even stronger and more significant relationship between TCT and AP. There is no evidence of a significant relationship between PWBS and TCT. To further explore the relationships between cognitive and noncognitive variables in AP, we turned to a regression analysis. We opted for a linear regression with PWBS and TCT as predictors and AP as the criterion or dependent variable. Table 3 presents the regression values obtained.


Table 3 . Impact of psychological well-being and critical thinking on academic performance.

The regression model was found to be statistically significant, with F(2, 88) = 18.571, p < 0.001. This indicates that, collectively, PWBS and TCT provide significant prediction of AP. The adjusted coefficient of determination (R²) is 0.285, which means that approximately 29% of the variability in AP can be explained by the independent variables in the model. As can be seen from the t-values and significance levels, both variables have a significant impact on AP, although TCT has the greater impact.
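The regression and its derived statistics can be sketched with ordinary least squares. The data below are synthetic and the slopes merely echo the magnitudes reported in Table 3; this is not a reproduction of the study's analysis:

```python
import numpy as np

# OLS regression of AP on PWBS and TCT, with adjusted R-squared and the
# model F statistic (synthetic data; slopes echo Table 3 magnitudes).
rng = np.random.default_rng(7)
n = 128
pwbs = rng.normal(80, 10, n)
tct = rng.normal(37, 8, n)
ap = 2.0 + 0.022 * pwbs + 0.074 * tct + rng.normal(0, 0.9, n)

X = np.column_stack([np.ones(n), pwbs, tct])      # intercept + two predictors
beta, *_ = np.linalg.lstsq(X, ap, rcond=None)
resid = ap - X @ beta
r2 = 1 - resid @ resid / ((ap - ap.mean()) @ (ap - ap.mean()))
k = 2                                             # number of predictors
r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)     # adjusted R-squared
f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))      # model F statistic
```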

In a complementary manner, with the objective of enriching the analysis of the influence of the CT on the PA, we have included additional measures to the grade obtained by the students in the course (NCT), such as the selectivity grade with which they entered the university (NEBAU), the average grade of the transcript (NMEXP), that is, the grades of the other courses that the students must take, and the pretest results obtained with the PENCRISAL (PCT). The data obtained are recorded in Table 4 .


Table 4 . Correlations between study variables and complementary measures.

Table 4 shows that the relationship between PWBS and NMEXP has a Pearson correlation of 0.075, with a p-value of 0.372; this low correlation indicates that the connection is minimal. In contrast, the relationship between TCT and NMEXP shows a stronger correlation of 0.464, suggesting a moderate positive association. The significance of this correlation, less than 0.001, indicates a statistically significant relationship unlikely to be due to chance. A similar case occurs with the relationship between NEBAU and NMEXP.

Given this context, if we perform a multiple linear regression analysis with NMEXP as the dependent variable and PWBS and TCT as independent variables, we would expect TCT to have the more significant impact on NMEXP. This projection is based on the statistically significant correlation between these variables. On the other hand, NEBAU has a slightly lower correlation with NMEXP than TCT (0.455 vs. 0.464), but the difference is very small, indicating that both have similar predictive capacity for NMEXP in terms of linear correlation.

Confirmation of these hypotheses by an appropriate regression analysis would provide a more detailed and accurate understanding of how PWBS and TCT individually contribute to the prediction of NMEXP, considering the influence of interrelated variables. However, in performing this procedure, a reduction in sample size to only 64 cases was observed. This increases the risk of failing to detect significant differences and could lead to unstable effect estimates.
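The shrinkage arises from listwise deletion: only cases with every measure present enter the regression. A sketch with synthetic data, in which half the transcript grades are taken as missing to mirror the drop to 64 cases:

```python
import numpy as np
import pandas as pd

# Listwise deletion: rows with any missing measure are excluded from the
# regression, shrinking the effective sample (synthetic data).
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "PWBS": rng.normal(80, 10, 128),
    "TCT": rng.normal(37, 8, 128),
    "NMEXP": rng.normal(7, 1, 128),
})
missing = rng.choice(128, size=64, replace=False)  # students without NMEXP
df.loc[missing, "NMEXP"] = np.nan
complete = df.dropna()                             # listwise deletion
print(len(df), "->", len(complete))                # prints: 128 -> 64
```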

4 Discussion and conclusions

CT seeks to understand and effectively solve problems through a correct framing of the problem, the generation of solution alternatives filtered by the mechanism of explanation, and the selection of a solution, all with the aim of achieving a desired change. The PENCRISAL test is based on this defining framework of CT ( Saiz and Rivas, 2008 ; Rivas and Saiz, 2012 ). Therefore, if we start from this concept and look at the data, we can conclude that CT is a good predictor of academic performance.

Table 2 shows a positive and moderate correlation (0.514) between CT and academic performance, suggesting that an increase in CT is associated with an improvement in academic performance. Meanwhile, Table 3 shows, with a B coefficient of 0.074 and a Beta of 0.473, that CT has a stronger relationship with academic performance than PWBS. This means that for every unit increase in CT, academic performance increases on average by 0.074 units, and this effect is significant in the model. The robust correlation and its impact on the dependent variable highlight that CT is a determinant competence for academic performance, suggesting it as a relevant diagnostic and formative tool in the educational field. Although it is not the only factor that influences academic performance, CT is a significant predictor and one that can be worked on or trained in the classroom.
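As a worked reading of the unstandardized slope (the coefficient comes from Table 3; the 10-point change is an arbitrary example, not a study result):

```python
# Predicted change in academic performance for a given change in total CT
# score, reading the unstandardized Table 3 coefficient at face value.
B_TCT = 0.074        # unstandardized coefficient for TCT from Table 3
delta_tct = 10       # e.g. a 10-point gain on the 0-70 PENCRISAL scale
predicted_gain = round(B_TCT * delta_tct, 2)  # 0.74 points on the 1-10 grade scale
```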

In its findings, the current study coincides with the results reported in Rivas et al. (2023) . On that occasion, the authors found that CT is a predictor of academic performance and that the benefits of instruction can be sustained over time; that study showed a correlation between CT and academic performance of 0.32. The main difference between the two studies concerns their objectives. The previous study did not attend to the explicit discussion of how CT could influence well-being, or vice versa. The current work recovers this line and incorporates non-cognitive variables into the analysis framework to account for well-being, under the assumption that this construct should have a significant impact on academic performance.

More generally, if we consider that, although the construct of intelligence is not the same as CT, the two have several points of convergence ( Butler et al., 2017 ), then we can establish a dialogue with other studies on the factors that influence academic performance. Intelligence represents the intrinsic capacity to learn, understand, reason, and meet challenges through problem solving in order to adapt to the environment ( Sternberg, 1985 ). This cognitive potential manifests itself in various ways, with CT being one of its most relevant expressions, particularly in situations that demand deep analysis, evaluation, and decisions based on logical reasoning ( Saiz, 2024 ). CT, therefore, acts as an essential tool that intelligence employs to navigate effectively through complex and challenging real-world situations ( Halpern and Butler, 2018 ).

In this conceptual line, the current results partially coincide with studies that have shown that the best predictors of academic performance are cognitive components, such as measures of general intelligence, analogical reasoning, fluid intelligence, logical, verbal and quantitative reasoning ( Morales-Vives et al., 2020 ; Mammadov, 2022 ); as well as scores on the diagnostic and university entrance test ( Gilar-Corbi et al., 2020 ).

In our study the other factor of analysis was PWB. Although its non-cognitive nature would place it per se at a disadvantage compared to cognitive factors, the data show that its inclusion in educational research, especially to account for academic performance, is significant. In Table 2 , the analysis of the correlation between PWBS and academic performance reveals a positive relationship with a correlation coefficient of 0.336. Although the correlation is moderate and not as strong as that observed between CT and academic performance, it is still significant and should not be ignored in the pursuit of improving students’ academic performance. Table 3 shows that PWBS has a positive and significant influence on the dependent variable. The standardized coefficient (Beta) of 0.271 indicates a positive relationship between PWBS and academic achievement. The unstandardized coefficient (B) shows that, holding all other variables constant, for each unit increase in PWBS, academic performance increases on average by 0.022 units. This relationship, supported by a low standard error of 0.007, points to a moderate but significant contribution of PWBS relative to the other variables.

These findings suggest that integrating certain aspects of PWBS could be an effective strategy for improving academic performance, evidencing a beneficial and significant relationship between the two. PWB can influence academic performance through non-cognitive conditions or factors involved in learning, such as motivation, academic satisfaction, effective coping with stress or anxiety, and the acceptance and management of limitations related to the process of appropriating and adapting to one’s own identity.

However, it is important to emphasize that PWB is a construct that requires careful theoretical and empirical review in the educational context, as debate around the Ryff scale remains open and the lack of consensus on its number of dimensions influences these results. To cite just one case, we used three dimensions out of six, with statistical and literature support, but the data might differ under a different selection approach. This finding highlights the importance of students’ PWB as part of a comprehensive educational strategy, but it also shows that the direct impact of PWB on academic performance may be less pronounced than that of cognitive skills and that, given its multidimensional and complex nature, it is not easy to translate into an instructional design. Even so, higher education institutions can attend to the institutional and relational climate so that students feel well and take advantage of the formative and educational opportunities of the academic environment. In the case of CT, there are concrete, validated training strategies that improve skills such as argumentation, explanation, problem solving and decision making (Guamanga et al., 2023; Saiz, 2024). The same cannot be said for PWB, owing to the lack of empirical support; however, some studies have proposed a path that incorporates socio-emotional competences into CT training, a proposal characterized by the cognitive-emotional methodology, with interesting results that still need to be explored and debated (Hanna, 2013).

Table 2 shows low and non-significant correlations between PWBS and the different forms of reasoning (deductive, inductive and practical), as well as with decision making and problem solving. For example, the correlation between PWBS and deductive reasoning is −0.082, which is not only low but also lacks statistical significance. The correlation between PWBS and decision making, −0.132, is likewise low and not significant. Although there is a positive correlation between PWBS and problem solving (0.040), it is very low and not statistically significant, so there is not enough evidence to claim a positive relationship between these variables. This reinforces the idea that there is no direct and significant relationship between how a student feels psychologically and CT skills or, more precisely, that such a relationship is not supported by the data from this sample. It is possible that unexamined mediating factors influence these relationships, or that the relationship exists in a different context or with different measures.
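Why coefficients this small fail to reach significance can be checked directly from r and the sample size. The sketch below applies the usual t test for a Pearson correlation, with a normal approximation to the two-sided p-value; the sample size n is an assumption for illustration only, while r is the PWBS–deductive reasoning value from Table 2.

```python
# Hedged sketch: significance test for a Pearson correlation coefficient.
# r comes from Table 2; n is an assumed sample size, not the study's.
import math

n, r = 170, -0.082
t = r * math.sqrt((n - 2) / (1 - r ** 2))  # t statistic with df = n - 2
# Two-sided p-value via the normal approximation (accurate for large df)
p = math.erfc(abs(t) / math.sqrt(2))
print(round(p, 3))  # well above 0.05, so the correlation is not significant
```

Only when |t| exceeds roughly 1.96 (at this sample size) would the correlation clear the conventional 0.05 threshold, which a value as low as −0.082 does not approach.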

The results of the present study do not coincide with other research that has shown positive relationships between decision-making and PWBS, especially with self-acceptance, environmental mastery, and purpose in life. The study by Páez-Gallego et al. (2020) addresses this issue by exploring how the PWBS of adolescents in Madrid, Spain, is linked to their decision-making methods. The research concludes that there is a positive correlation between the use of adaptive decision-making strategies and PWBS. Adolescents who opt for a rational and systematic evaluation of available options report higher levels of well-being. Specifically, adaptive decision-making style correlates significantly with overall well-being (0.544) and with aspects such as self-acceptance (0.485), positive relationships with others (0.242), environmental mastery (0.472), autonomy (0.359), purpose in life (0.473), and personal growth (0.346). In contrast, those who resort to maladaptive strategies, marked by impulsivity or avoidance, show reduced PWBS (−0.458).

The discrepancy with this study could be due to the difference between the instruments used to assess decision making. While Páez-Gallego et al. (2020) used the Flinders Adolescent Decision Making Questionnaire (FADMQ), which focuses on personal perceptions and experiences of decision making, our study used the PENCRISAL, which, although not limited to decision making, includes this ability as an essential component of CT. The latter measures the ability to identify, analyze and solve everyday problems through items that simulate real situations, assessing the ability to choose the best solution or action strategy. Because PENCRISAL responses are open-ended, the test allows for a detailed assessment of how participants describe or explain their decisions. Ultimately, the fundamental difference between these two measures is that one is a self-report of perceptions and experiences, while the other is a set of problems to be solved correctly; in other words, one collects impressions of decision making and the other collects decision making as performed. Therefore, although both studies applied Ryff’s PWBS, the differences in instruments and in the approach to decision making explain the variations in the results. This divergence evidences the relevance of considering the context and the specific instrument when interpreting the relationship between PWBS and decision making.

Despite these findings, the need to further explore these interactions persists, especially given that the three selected dimensions (self-acceptance, environmental mastery, and purpose in life) theoretically align with CT approaches focused on explanation and the development of post-decisional skills, such as decision making and problem solving (Guamanga et al., 2023). A CT approach that emphasizes the development of these skills must consider effects that transcend immediate or tangible outcomes. Therefore, it is crucial to understand how the concept of PWB, as examined above, relates to CT. Specifically, it must be determined whether some of these dimensions align directly to foster effective CT, or whether they instead lean toward a conception of well-being in a more general sense, which could include hedonic aspects.

The emphasis on CT oriented to decision making and problem solving through the analysis of explanations and causality should be evaluated for its pragmatic effects on PWB. At first glance this idea seems to confront parallel concepts paradoxically united by the same diachronic nature. In the case of CT, this nature explains the high demands placed on it. For example, it is not enough to say that it contributes to tangible improvements in academic performance; its usefulness is expected to transcend academia and materialize in skills of interest to organizations in all sectors of the economy (Casner-Lotto and Barrington, 2006; Atanasiu, 2021). However, its practical impact still presents serious challenges, especially when students, as active subjects of learning, face limitations in anticipating the usefulness and applicability of these critical skills for the future. This is partly explained by the fact that the educational system prioritizes academic performance over the comprehensive development later required in the professional sphere (Saiz, 2020). This means that CT can be interpreted as an unfulfilled or partial promise. That reading, however, omits the particular contexts, interests, motivations and concerns of students while they take part in these instructional programs, as well as how those same factors are weighed by a student who knows that he or she must make the transition to the professional field.

A similar situation occurs with PWB as a diachronic phenomenon. A single moment in time is not enough to understand and analyze students’ PWB. It is necessary to focus on how it changes and evolves through different stages, including through feelings of achievement or frustration in the academic process. Thus, it is recognized that PWB is not static and, therefore, evolves through lived experiences, among them those involving the applicability of a set of learned skills. This implies that, as diachronic phenomena, CT and PWB can evolve and influence each other over time. This approach requires longitudinal studies that follow the impact of curricular interventions aimed at strengthening cognitive skills such as those of CT, in order to understand how these may influence PWB in the long term.

The limitations of this study, beyond the small sample that prevents generalization of the results, the examination of only certain dimensions of the PWBS, and the theoretical impossibility of performing regression analyses with other performance measures, lie in the diachronic nature of the constructs studied. This characteristic makes it difficult, as has been argued, to give a definitive answer on their relationship.

Within the framework of the PWBS triad model we are analyzing, it is possible to theoretically group several key concepts. The development of CT involves a process of self-acceptance, which is crucial given our inherent tendency toward error. This process allows us, through a reflective evaluation of our past and present, to recognize and accept beliefs that we have discarded as erroneous. Such self-acceptance facilitates deeper introspection, allowing us to see these errors as essential learning opportunities in our lives. On the other hand, any model that emphasizes post-decisional skills must also consider the non-linear complexity of our reality and provide solid criteria for problem solving and decision making in order to master our environment more effectively. This is what allows us to adapt better, both biologically and socially. Finally, this approach to CT inevitably values purpose in life, seeking to ensure that it is determined, in part, by integrating the best tools of science, philosophy and education for a more effective life orientation grounded in the principles of rationality. The importance of setting clear goals, and of recognizing that their achievement requires effort, discipline and determination, is essential to being an effective critical thinker.

Therefore, although each dimension proposed by Ryff’s PWBS possesses a conceptual richness that requires empirical validation, the dimensions selected for this study are aligned with a model of CT focused on problem solving and real-world decision making. Although we aspired to discover stronger links between PWB and CT, and to deepen their interrelationship, the theoretical parallelism analyzed is also reflected in the empirical results. Moreover, PWB as an operational concept, due to its complexity and multidimensionality, is subject to continuous revisions or possible unifications into a broader notion of well-being.

In future research on this topic, it is essential to include a broader set of variables predictive of academic performance, including, but not limited to, students’ university entrance records and cumulative grades in other subjects. In addition, a more solid and theoretically robust concept of well-being must be adopted, one that fits contemporary educational and professional demands. This concept must transcend the simple distinction between eudaemonic and hedonic well-being and address its diachronic nature. It is important to explore how these dimensions of well-being are interrelated, whether as cause or effect, and to examine whether CT fosters a virtuous circle with well-being.

Data availability statement

The original contributions presented in the study are included in the article/supplementary material; further inquiries can be directed to the corresponding author.

Ethics statement

Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.

Author contributions

MG: Writing – original draft, Conceptualization, Investigation, Methodology, Writing – review & editing. CS: Investigation, Methodology, Project administration, Validation, Writing – original draft. SR: Data curation, Investigation, Supervision, Validation, Writing – review & editing. LA: Formal analysis, Methodology, Validation, Writing – review & editing.

Funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This study was funded by the Universidad de Salamanca, Spain.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Abrami, P. C., Bernard, R. M., Borokhovski, E., Waddington, D. I., Wade, C. A., and Persson, T. (2015). Strategies for teaching students to think critically. Rev. Educ. Res. 85, 275–314. doi: 10.3102/0034654314551063

Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., et al. (2008). Instructional interventions affecting critical thinking skills and dispositions: a stage 1 Meta-analysis. Rev. Educ. Res. 78, 1102–1134. doi: 10.3102/0034654308326084

Acee, T. W., Cho, Y. J., Kim, J.-I., and Weinstein, C. E. (2012). Relationships among properties of college students’ self-set academic goals and academic achievement. Educ. Psychol. 32, 681–698. doi: 10.1080/01443410.2012.712795

Alonso-Borrego, C., and Romero-Medina, A. (2016). Wage expectations for higher education students in Spain. Labour 30, 1–17. doi: 10.1111/labr.12072

Atanasiu, R. (2021). Critical thinking for managers: structured decision-making and persuasion in business. Management for professionals . Cham: Springer International Publishing.

Bailin, S., Case, R., Coombs, J. R., and Daniels, L. B. (1999). Conceptualizing critical thinking. J. Curric. Stud. 31, 285–302. doi: 10.1080/002202799183133

Blasco-Belled, A., and Alsinet, C. (2022). The architecture of psychological well-being: a network analysis study of the Ryff psychological well-being scale. Scand. J. Psychol. 63, 199–207. doi: 10.1111/sjop.12795

Butler, H. A., Pentoney, C., and Bong, M. P. (2017). Predicting real-world outcomes: critical thinking ability is a better predictor of life decisions than intelligence. Think. Skills Creat. 25, 38–46. doi: 10.1016/j.tsc.2017.06.005

Casanova, J. R., Cervero, A., Núñez, J. C., Almeida, L. S., and Bernardo, A. (2018). Factors that determine the persistence and dropout of university students. Psicothema 30, 408–414. doi: 10.7334/psicothema2018.155

Casner-Lotto, J., and Barrington, L. (2006). Are they really ready to work? Employers’ perspectives on the basic knowledge and applied skills of new entrants to the 21st century U.S. Workforce . United States: Conference Board: Partnership for 21st Century Skills: Corporate Voices for Working Families: Society for Human Resource Management.

Díaz, D., Rodríguez-Carvajal, R., Blanco, A., Moreno-Jiménez, B., Gallardo, I., Valle, C., et al. (2006). Spanish adaptation of the psychological well-being scales (PWBS). Psicothema 18, 572–577

Disabato, D. J., Goodman, F. R., Kashdan, T. B., Short, J. L., and Jarden, A. (2016). Different types of well-being? A cross-cultural examination of hedonic and Eudaimonic well-being. Psychol. Assess. 28, 471–482. doi: 10.1037/pas0000209

Dumitru, D., and Halpern, D. F. (2023). Critical thinking: creating job-proof skills for the future of work. J. Intelligence 11:194. doi: 10.3390/jintelligence11100194

Dwyer, C. P. (2023). An evaluative review of barriers to critical thinking in educational and real-world settings. J. Intelligence 11:105. doi: 10.3390/jintelligence11060105

Dwyer, C. P., Hogan, M. J., and Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Think. Skills Creat. 12, 43–52. doi: 10.1016/j.tsc.2013.12.004

Elder, L. (1997). Critical thinking: the key to emotional intelligence. J. Dev. Educ. Available at: https://api.semanticscholar.org/CorpusID:140360538

Ennis, R. H. (2015). “Critical thinking: a streamlined conception” in The Palgrave handbook of critical thinking in higher education . eds. M. Davies and R. Barnett (New York: Palgrave Macmillan US), 31–47.

Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. ‘The Delphi Report’. The California Academic Press. Available at: https://files.eric.ed.gov/fulltext/ED315423.pdf

Flanagan, O., LeDoux, J. E., Bingle, B., Haybron, D. M., Mesquita, B., Moody-Adams, M., et al. (2023). Against Happiness . New York: Columbia University Press.

Frick, B., and Maihaus, M. (2016). The structure and determinants of expected and actual starting salaries of higher education students in Germany: identical or different? Educ. Econ. 24, 374–392. doi: 10.1080/09645292.2015.1110115

Gilar-Corbi, R., Pozo-Rico, T., Castejón, J.-L., Sánchez, T., Sandoval-Palis, I., and Vidal, J. (2020). Academic achievement and failure in university studies: motivational and emotional factors. Sustainability 12:9798. doi: 10.3390/su12239798

Guamanga, M. H., González, F. A., Saiz, C., and Rivas, S. F. (2023). Critical thinking: the ARDESOS-DIAPROVE program in dialogue with the inference to the best and only explanation. J. Intelligence 11:226. doi: 10.3390/jintelligence11120226

Halpern, D. F. (1998). Teaching critical thinking for transfer across domains: disposition, skills, structure training, and metacognitive monitoring. Am. Psychol. 53, 449–455. doi: 10.1037/0003-066X.53.4.449

Halpern, D. F., and Butler, H. A. (2018). “Is critical thinking a better model of intelligence?” in The nature of human intelligence . ed. R. J. Sternberg (New York, NY, US: Cambridge University Press), 183–196.

Halpern, D. F., and Dunn, D. S. (2023). Thought and knowledge: an introduction to critical thinking . 6th Edn. New York: Routledge.

Hanna, E. P. (2013). A cognitive emotional methodology for critical thinking. Adv. Appl. Sociol. 3, 20–25. doi: 10.4236/aasoci.2013.31003

Huppert, F. A. (2009). Psychological well-being: evidence regarding its causes and consequences. Appl. Psychol. Health Well Being 1, 137–164. doi: 10.1111/j.1758-0854.2009.01008.x

Jahn, D. (2019). “Verantwortung für das eigene Denken und Handeln übernehmen: Impulse zur Förderung von kritischem Denken in der Lehre” in Kritische Hochschullehre . eds. D. Jahn, A. Kenner, D. Kergel, and B. Heidkamp-Kergel (Wiesbaden: Springer Fachmedien Wiesbaden), 19–46.

Jahn, D., and Kenner, A. (2018). “Critical thinking in higher education: how to Foster it using digital media” in The digital turn in higher education . eds. D. Kergel, B. Heidkamp, P. K. Telléus, T. Rachwal, and S. Nowakowski (Wiesbaden: Springer Fachmedien Wiesbaden), 81–109.

Mammadov, S. (2022). Big five personality traits and academic performance: a Meta-analysis. J. Pers. 90, 222–255. doi: 10.1111/jopy.12663

Morales-Vives, F., Camps, E., and Dueñas, J. M. (2020). Predicting academic achievement in adolescents: the role of maturity, intelligence and personality. Psicothema 32, 84–91. doi: 10.7334/psicothema2019.262

Nogueira, C., Risoto, M. A., and Meier, L. K. (2023). Evaluación de La Escala de Bienestar Psicológico de Ryff Para Estudiantes Universitarios. Revista Iberoamericana de Diagnóstico y Evaluación – e Avaliação Psicológica 70, 161–174. doi: 10.21865/RIDEP70.4.12

Oliván Blázquez, B., Masluk, B., Gascon, S., Fueyo Díaz, R., Aguilar-Latorre, A., Artola Magallón, I., et al. (2019). The use of flipped classroom as an active learning approach improves academic performance in social work: a randomized trial in a university. PLoS One 14, 1–15. doi: 10.1371/journal.pone.0214623

Páez-Gallego, J., Gallardo-López, J. A., López-Noguero, F., and Rodrigo-Moriche, M. P. (2020). Analysis of the relationship between psychological well-being and decision making in adolescent students. Front. Psychol. 11:1195. doi: 10.3389/fpsyg.2020.01195

Rivas, S. F., and Saiz, C. (2012). Validación y Propiedades Psicométricas de La Prueba de Pensamiento Crítico PENCRISAL. Revista Electrónica de Metodología Aplicada 17, 18–34.

Rivas, S. F., Saiz, C., and Almeida, L. S. (2023). The role of critical thinking in predicting and improving academic performance. Sustainability 15:1527. doi: 10.3390/su15021527

Ryff, C. D. (1989). Happiness is everything, or is it? Explorations on the meaning of psychological well-being. J. Pers. Soc. Psychol. 57, 1069–1081. doi: 10.1037/0022-3514.57.6.1069

Ryff, C. D. (2013). Psychological well-being revisited: advances in the science and practice of Eudaimonia. Psychother. Psychosom. 83, 10–28. doi: 10.1159/000353263

Ryff, C. D., and Keyes, C. L. M. (1995). The structure of psychological well-being revisited. J. Pers. Soc. Psychol. 69, 719–727. doi: 10.1037/0022-3514.69.4.719

Saiz, C. (2020). Pensamiento crítico y eficacia . 2nd Edn. Madrid: Ediciones Pirámide.

Saiz, C. (2024). “Pensamiento crítico y cambio” in Psicología . 2nd ed (Madrid: Ediciones Pirámide).

Saiz, C., and Rivas, S. F. (2008). Evaluación En Pensamiento Crítico: Una Propuesta Para Diferenciar Formas de Pensar. Ergo, Nueva Época 22, 25–66.

Saiz, C., and Rivas, S. F. (2023). Critical thinking, formation, and change. J. Intelligence 11:219. doi: 10.3390/jintelligence11120219

Sternberg, R. J. (1985). Beyond IQ: a Triarchic theory of human intelligence . New York, NY, US: Cambridge University Press.

Van Dierendonck, D., and Lam, H. (2023). Interventions to enhance eudaemonic psychological well-being: a meta-analytic review with Ryff’s scales of psychological well-being. Appl. Psychol. Health Well Being 15, 594–610. doi: 10.1111/aphw.12398

Keywords: critical thinking, psychological well-being, academic performance, higher education, assessment

Citation: Guamanga MH, Saiz C, Rivas SF and Almeida LS (2024) Analysis of the contribution of critical thinking and psychological well-being to academic performance. Front. Educ . 9:1423441. doi: 10.3389/feduc.2024.1423441

Received: 25 April 2024; Accepted: 28 June 2024; Published: 16 July 2024.

Copyright © 2024 Guamanga, Saiz, Rivas and Almeida. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Carlos Saiz, [email protected]

IMAGES

  1. Critical Thinking Skills

    critical thinking learning outcomes writing

  2. Critical Thinking

    critical thinking learning outcomes writing

  3. 5+ Critical Thinking Strategies for your Essay (2024)

    critical thinking learning outcomes writing

  4. How To Write Learning Outcomes

    critical thinking learning outcomes writing

  5. Educational Classroom Posters And Resources

    critical thinking learning outcomes writing

  6. 5+ Critical Thinking Strategies for your Essay (2023)

    critical thinking learning outcomes writing

VIDEO

  1. Learn from Mistakes or Study Others' Errors First? #shorts

  2. Lucy Calkins on Her New Writing Series

  3. Critical Thinking

  4. Understanding the Critical Reading & Writing Skills Test

  5. Learning oriented assessment, critical thinking and English language speaking skills with Dr Mansoor

  6. What does critical thinking involve? #literacy #criticalthinking

COMMENTS

  1. Writing Student Learning Outcomes

    Academic Program Learning Outcomes. The following examples of academic program student learning outcomes come from a variety of academic programs across campus, and are organized in four broad areas: 1) contextualization of knowledge; 2) praxis and technique; 3) critical thinking; and, 4) research and communication.

  2. Writing and Assessing Student Learning Outcomes

    Student learning outcomes also: Should align with the institution's curriculum and co-curriculum outcomes (Maki, 2010) Should be collaboratively authored and collectively accepted (Maki, 2010) Should incorporate or adapt professional organizations outcome statements when they exist (Maki, 2010) Can be quantitatively and/or qualitatively ...

  3. Creating Learning Outcomes

    Learning outcomes benefit instructors. Learning outcomes can help instructors in a number of ways by: Providing a framework and rationale for making course design decisions about the sequence of topics and instruction, content selection, and so on. Communicating to students what they must do to make progress in learning in your course.

  4. PDF WRITING LEARNING OUTCOMES

    7. • In writing learning outcomes it is important to keep in mind that we assess what is taught . Find learning outcomes that are a source of objective data for assessment, indicating clearly what learners have to understand, know and/or be able to do . All learning outcomes have to be observable and measurable .

  5. Using Bloom's Taxonomy to Write Effective Learning Outcomes

    Strive to keep all your learning outcomes measurable, clear and concise. When you are ready to write, it can be helpful to list the level of Bloom's next to the verb you choose in parentheses. For example: Course level outcome 1. (apply) Demonstrate how transportation is a critical link in the supply chain. 1.1.

  6. PDF Holistically Assessing Critical Thinking and Written Communication ...

    Written Communication Learning Outcomes ... students' writing skills and their critical thinking capabilities (e.g., National Center for Education Statistics, 1993, 1994, 1995), whereas other research points to the independence ... seniors' attainment of written communication and critical thinking outcomes. The data collected through these ...

  7. PDF A Guide for Writing Student Learning Outcomes

    Learning Outcomes at UF are organized into three broad categories: Content Knowledge, Critical Thinking, and Communication. In order to demonstrate SACS standard 3.3.1, student learning outcomes should be reviewed periodically and realigned with the evolving outcomes and expectations for students graduating from UF.

  8. PDF WRITING LEARNING OUTCOMES 3. Well-written Learning Outcomes 4. Writing

    Learning Outcome 1. Upon completion of this unit, students will be able to critically assess the factors influencing physical activity and nutrition (environment, community, habits, underlying thinking, lifestyle and time allocation), and analyze how these factors shape their current choices.

  9. Unlocking writing success: Building assessment literacy for students

    The framework aligns intending to improve student learning outcomes by enhancing teachers' ability to design, ... AL is essential to unlocking writing success, and by taking appropriate interventions, students can improve their writing skills, develop critical thinking, and develop problem-solving abilities while also being motivated to achieve ...

  10. Learning outcomes and critical thinking

    The notion of critical thinking and its theoretical complexity are used as a case for an epistemological critique of the model of intended learning outcomes. The conclusion is that three problems of learning outcomes, previously discussed in the literature, become even more challenging when seen in the light of critical thinking.

  11. Writing to Think: Critical Thinking and the Writing Process

    "Writing is thinking on paper." (Zinsser, 1976, p. vii) Google the term "critical thinking." How many hits are there? On the day this tutorial was completed, Google found about 65,100,000 results in 0.56 seconds. That's an impressive number, and it grows more impressively large every day. That's because the nation's educators, business leaders, and political…

  12. Bridging critical thinking and transformative learning: The role of

    In recent decades, approaches to critical thinking have generally taken a practical turn, pivoting away from more abstract accounts - such as emphasizing the logical relations that hold between statements (Ennis, 1964) - and moving toward an emphasis on belief and action.According to the definition that Robert Ennis (2018) has been advocating for the last few decades, critical thinking is ...

  13. PDF Writing learning outcomes

    focussed on the learner. It is performance-oriented, typically beginning with an action verb (e.g. 'demonstrate', 'apply'), typically written in the future tense, typically relating the action to the unit content, and typically specifying the des. ed level of performance. A learning outcome should be a clear statement of what the ...

  14. PDF Writing Measurable and Meaningful Outcomes

    you could write a learning outcome that looks similar to the following: Students participating in the SL206 Service Learning through Music Therapy course will demonstrate an increase in critical thinking skills, as exhibited by an improvement in scores on the Chronicle Critical Thinking pre-test/post-test instrument.

  15. Learning Through Writing Boosts Student Success

    Increase Critical Thinking through Writing tackles the vital role of writing for enhancing literacy as a comprehensive strategy, facilitating improved understanding across various disciplines. This approach advocates for the integration of engaging writing tasks within every lesson, aiming to augment academic performance by prompting students ...

  16. Writing Learning Outcomes

    Writing Learning Outcomes. Once the promise of a course is understood and articulated, it is easier to talk about the student learning goals, which are typically written out in the form of learning outcomes. ... Think about not only what knowledge students should gain, but what skills they will be developing (critical thinking skills, creative ...

  17. Learning to Improve: Using Writing to Increase Critical Thinking

    Increasingly, national stakeholders express concern that U.S. college graduates cannot adequately solve problems and think critically. As a set of cognitive abilities, critical thinking skills provide students with tangible academic, personal, and professional benefits that may ultimately address these concerns. As an instructional method, writing has long been perceived as a way to improve ...

  18. PDF What Are Student Learning Outcomes?

    What Are Student Learning Outcomes?Learning outcomes are statements of the knowledge, skills and abilities individual students should possess and can demonstrate upon completion of a learning experience. r sequence of learning experiences. Before preparing a list of learning outcomes consi. er the following recommendations: Learning outcomes.

  19. CTL Guide to Critical Thinking

    Introduction. The ability to think critically is the fundamental characteristic of an educated person. It is required for just, civil society and governance, prized by employers, and essential for the growth of wisdom. Critical thinking is what most people name first when asked about the essential components of a college education.

  20. Eight Instructional Strategies for Promoting Critical Thinking

    At Avenues World School, critical thinking is one of the Avenues World Elements and is an enduring outcome embedded in students' early experiences through 12th grade.

  21. Essential Learning Outcomes: Critical/Creative Thinking

    Guide to Critical/Creative Thinking. Intended Learning Outcome: Analyze, evaluate, and synthesize information in order to consider problems/ideas and transform them in innovative or imaginative ways (See below for definitions) Assessment may include but is not limited to the following criteria and intended outcomes:

  22. Emerson College Library: Learning Outcomes: A Basic Guide to SLOs

    Two common pitfalls: learning outcomes that are too narrow or focused on specific tasks can limit the scope of learning, while learning outcomes that are too broad make it difficult to assess whether students have achieved the intended learning.

  23. Learning outcomes and critical thinking

    ABSTRACT. The notion of critical thinking and its theoretical complexity are used as a case for an epistemological critique of the model of intended learning outcomes. The conclusion is that three problems of learning outcomes, previously discussed in the literature, become even more challenging when seen in the light of critical thinking.

  24. Critical writing

    You can apply our critical thinking model to your own work; use our Judging your own work (PDF) questions to help you decide if your writing is critical. These questions will take you through the description-analysis-evaluation stages. Take a look at further examples of descriptive writing vs. critical/analytical writing (PDF)

  25. Learning to Improve: Using Writing to Increase Critical Thinking

    Although they are not always transparent to many college students, the academic and personal benefits of critical thinking are well established: students who can think critically tend to get better grades, are often better able to use reasoning in daily decisions (U.S. Department of Education, 1990), and are generally more employable.

  26. Critical Thinking Is All About "Connecting the Dots"

    Key points. Critical thinking requires us to simultaneously analyze and interpret different pieces of information. To effectively interpret information, one must first be able to remember it.

  27. Frontiers

    The instruments applied were Ryff's PWBS in its Spanish adaptation (Díaz et al., 2006) and the PENCRISAL critical thinking test (Saiz and Rivas, 2008; Rivas and Saiz, 2012). For academic performance, the academic records of the students participating in the critical thinking course during the first year of the psychology degree were collected.