16. Reporting quantitative results

Chapter outline.

  • Reporting quantitative results (8 minute read time)

Content warning: Brief discussion of violence against women.

16.1 Reporting quantitative results

Learning objectives.

Learners will be able to…

  • Execute a quantitative research report using key elements for accuracy and openness

So you’ve completed your quantitative analyses and are ready to report your results. We’re going to spend some time talking about what matters in quantitative research reports, but the very first thing to understand is this: openness with your data and analyses is key. You should never hide what you did to get to a particular conclusion and, if someone wanted to and could ethically access your data, they should be able to replicate more or less exactly what you did. While your quantitative report won’t have every single step you took to get to your conclusion, it should have plenty of detail so someone can get the picture.

Below, I’m going to take you through the key elements of a quantitative research report. This overview is pretty general and conceptual, and it will be helpful for you to look at existing scholarly articles that deal with quantitative research (like ones in your literature review) to see the structure applied. Also keep in mind that your instructor may want the sections broken out slightly differently; nonetheless, the content I outline below should be in your research report.

Introduction and literature review

These are what you’re working on building with your research proposal this semester. They should be included as part of your research report so that readers have enough information to evaluate your research for themselves. What’s here should be very similar to the introduction and literature review from your research proposal, where you described the literature relevant to the study you wanted to do. With your results in hand, though, you may find that you have to add information to the literature you wrote previously to help orient the reader of the report to important topics needed to understand the results of your study.

Data and methods

In this section, you should explicitly lay out your study design – for instance, if it was experimental, be specific about the type of experimental design. Discuss the type of sampling that you used, if that’s applicable to your project. You should also go into a general description of your data, including the time period, any exclusions you made from the original data set and the source – i.e., did you collect it yourself or was it secondary data? Next, talk about the specific statistical methods you used, like t-tests, Chi-square tests, or regression analyses. For descriptive statistics, you can be relatively general – you don’t need to say “I looked at means and medians,” for instance. You need to provide enough information here that someone could replicate what you did.
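To make that standard of replicability concrete, here is a minimal, purely illustrative sketch in Python (our addition, not part of the textbook, with invented data and variable names): every decision a reader would need in order to repeat the analysis, such as the exclusion rule and the specific test, is visible in the script, and these are exactly the details your methods section should spell out in prose.

```python
# Illustrative sketch only (hypothetical data): the analytic decisions a
# methods section should report are all visible in the code below.
import pandas as pd
from scipy import stats

# Hypothetical primary data: hours of service use by program group
df = pd.DataFrame({
    "group": ["treatment"] * 6 + ["comparison"] * 6,
    "hours": [12, 15, 9, 14, 11, 13, 8, 10, 7, 9, 11, 6],
    "age":   [34, 41, 29, 38, 55, 47, 33, 62, 28, 45, 39, 51],
})

# Exclusion rule that the report should state explicitly
# (here: participants under 30 are excluded)
analytic = df[df["age"] >= 30]

# Independent-samples t-test comparing hours of service use across groups
treatment = analytic.loc[analytic["group"] == "treatment", "hours"]
comparison = analytic.loc[analytic["group"] == "comparison", "hours"]
t_stat, p_value = stats.ttest_ind(treatment, comparison)

print(f"analytic n = {len(analytic)}, t = {t_stat:.2f}, p = {p_value:.3f}")
```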

In this section, you should also discuss how you operationalized your variables. What did you mean when you asked about educational attainment – did you ask for a grade number, or did you ask them to pick a range that you turned into a category? This is key information for readers to understand your research. Remember when you were looking for ways to operationalize your variables? Be the kind of author who provides enough information on operationalization so people can actually understand what you did.

Results

You’re going to run lots of different analyses to settle on what finally makes sense to get a result – positive or negative – for your study. For this section, you’re going to provide tables with descriptions of your sample, including, but not limited to, sample size, frequencies of sample characteristics like race and gender, levels of measurement, appropriate measures of central tendency, standard deviations and variances. Here you will also want to focus on the analyses you used to actually draw whatever conclusion you settled on, both descriptive and inferential (i.e., bivariate or multivariate).
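As a rough, hypothetical illustration of the sample-description quantities listed above (our sketch, not the textbook’s; the data and variable names are invented), one way to generate them with Python and pandas might look like this:

```python
# Illustrative sketch only (hypothetical survey data and variable names):
# one way to compute the pieces of a sample-description table.
import pandas as pd

sample = pd.DataFrame({
    "gender": ["F", "M", "F", "F", "M", "F", "M", "F"],
    "race":   ["Black", "White", "White", "Asian", "Black", "White", "White", "Latino"],
    "age":    [23, 35, 41, 29, 52, 38, 47, 31],
    "score":  [16, 22, 9, 30, 12, 25, 18, 21],   # hypothetical scale score
})

print("Sample size:", len(sample))

# Frequencies and percentages for categorical characteristics
for col in ["gender", "race"]:
    counts = sample[col].value_counts()
    pct = sample[col].value_counts(normalize=True).mul(100).round(1)
    print(f"\n{col}\n", pd.DataFrame({"n": counts, "%": pct}))

# Central tendency and spread for continuous variables
print("\n", sample[["age", "score"]].agg(["mean", "median", "std", "var"]).round(2))
```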

The actual statistics you report depend entirely on the kind of statistical analysis you do. For instance, if you’re reporting on a logistic regression, it’s going to look a little different than reporting on an ANOVA. In the previous chapter, we provided links to open textbooks that detail how to conduct quantitative data analysis. You should look at these resources and consult with your research professor to help you determine what is expected in a report about the particular statistical method you used.
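For example, a logistic regression is usually summarized with odds ratios and their p values, while an ANOVA is summarized with an F statistic, its degrees of freedom, and a p value. The sketch below is our own hedged illustration of that difference, using randomly generated data and the scipy/statsmodels libraries; it is not a template from the textbook.

```python
# Illustrative sketch only (randomly generated data): what you report differs
# by analysis type, e.g. odds ratios for logistic regression vs. F statistics
# for ANOVA.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical logistic regression: report odds ratios and p values (plus CIs)
x = rng.normal(size=100)
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 + 0.8 * x))))
logit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
print("odds ratios:", np.exp(logit.params).round(2),
      "p values:", logit.pvalues.round(3))

# Hypothetical one-way ANOVA: report F, degrees of freedom, and p
g1, g2, g3 = rng.normal(0, 1, 30), rng.normal(0.4, 1, 30), rng.normal(0.8, 1, 30)
f_stat, p_value = stats.f_oneway(g1, g2, g3)
print(f"F(2, {3 * 30 - 3}) = {f_stat:.2f}, p = {p_value:.3f}")
```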

The important thing to remember here – as we mentioned above – is that you need to be totally transparent about your results, even and especially if they don’t support your hypothesis. There is value in a disproved hypothesis, too – you now know something about the way the world is not.

Discussion

In this section, you’re going to connect your statistical results back to your hypothesis and discuss whether your results support your hypothesis or not. You are also going to talk about what the results mean for the larger field of study of which your research is a part, the implications of your findings if you’re evaluating some kind of intervention, and how your research relates to what is already out there in this field. When your research doesn’t pan out the way you expect, if you’re able to make some educated guesses as to why this might be (supported by literature if possible, but practice wisdom works too), share those as well.

Let’s take a minute to talk about what happens when your findings disprove your hypothesis or actually indicate something negative about the group you are studying. The discussion section is where you can contextualize “negative” findings. For example, say you conducted a study that indicated that a certain group is more likely to commit violent crime. Here, you have an opportunity to talk about why this might be the case outside of their membership in that group, and how membership in that group does not automatically mean someone will commit a violent crime. You can present mitigating factors, like a history of personal and community trauma. It’s extremely important to provide this relevant context so that your results are more difficult to use against a group you are studying in a way that doesn’t reflect your actual findings.

Limitations

In this section, you’re going to critique your own study. What are the advantages, disadvantages, and trade-offs of what you did to define and analyze your variables? Some questions you might consider include:  What limits the study’s applicability to the population at large? Were there trade-offs you had to make between rigor and available data? Did the statistical analyses you used mean that you could only get certain types of results? What would have made the study more widely applicable or more useful for a certain group? You should be thinking about this throughout the analysis process so you can properly contextualize your results.

In this section, you may also consider discussing any threats to internal validity that you identified and whether you think you can generalize your research. Finally, if you used any measurement tools that haven’t been validated yet, discuss how this could have affected your results.

Significance and conclusions

Finally, you want to use the conclusions section to bring it full circle for your reader – why did this research matter? Talk about how it contributed to knowledge around the topic and how it might be used to further practice. Identify and discuss ethical implications of your findings for social workers and social work research. Finally, make sure to talk about the next steps for you, other researchers, or policy-makers based on your research findings.

Key Takeaways

  • Your quantitative research report should provide the reader with transparent, replicable methods and put your research into the context of existing literature, real-world practice and social work ethics.
  • Think about the research project you are building now. What could a negative finding be, and how might you provide your reader with context to ensure that you are not harming your study population?

Operationalization: the process of determining how to measure a construct that cannot be directly observed.

Internal validity: the ability to say that one variable "causes" something to happen to another variable. Very important to assess when thinking about studies that examine causation such as experimental or quasi-experimental designs.

Graduate research methods in social work Copyright © 2020 by Matthew DeCarlo, Cory Cummings, Kate Agnelli is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.


Organizing Your Social Sciences Research Paper: Quantitative Methods (USC Libraries Research Guides)

Quantitative methods emphasize objective measurements and the statistical, mathematical, or numerical analysis of data collected through polls, questionnaires, and surveys, or by manipulating pre-existing statistical data using computational techniques. Quantitative research focuses on gathering numerical data and generalizing it across groups of people or using it to explain a particular phenomenon.

Babbie, Earl R. The Practice of Social Research. 12th ed. Belmont, CA: Wadsworth Cengage, 2010; Muijs, Daniel. Doing Quantitative Research in Education with SPSS. 2nd edition. London: SAGE Publications, 2010.


Characteristics of Quantitative Research

Your goal in conducting a quantitative research study is to determine the relationship between one thing [an independent variable] and another [a dependent or outcome variable] within a population. Quantitative research designs are either descriptive [subjects usually measured once] or experimental [subjects measured before and after a treatment]. A descriptive study establishes only associations between variables; an experimental study establishes causality.

Quantitative research deals in numbers, logic, and an objective stance. Quantitative research focuses on numeric and unchanging data and detailed, convergent reasoning rather than divergent reasoning [i.e., the generation of a variety of ideas about a research problem in a spontaneous, free-flowing manner].

Its main characteristics are:

  • The data is usually gathered using structured research instruments.
  • The results are based on larger sample sizes that are representative of the population.
  • The research study can usually be replicated or repeated, given its high reliability.
  • The researcher has a clearly defined research question to which objective answers are sought.
  • All aspects of the study are carefully designed before data is collected.
  • Data are in the form of numbers and statistics, often arranged in tables, charts, figures, or other non-textual forms.
  • The project can be used to generalize concepts more widely, predict future results, or investigate causal relationships.
  • The researcher uses tools, such as questionnaires or computer software, to collect numerical data.

The overarching aim of a quantitative research study is to classify features, count them, and construct statistical models in an attempt to explain what is observed.

Things to keep in mind when reporting the results of a study using quantitative methods:

  • Explain the data collected and their statistical treatment as well as all relevant results in relation to the research problem you are investigating. Interpretation of results is not appropriate in this section.
  • Report unanticipated events that occurred during your data collection. Explain how the actual analysis differs from the planned analysis. Explain your handling of missing data and why any missing data does not undermine the validity of your analysis.
  • Explain the techniques you used to "clean" your data set.
  • Choose a minimally sufficient statistical procedure; provide a rationale for its use and a reference for it. Specify any computer programs used.
  • Describe the assumptions for each procedure and the steps you took to ensure that they were not violated.
  • When using inferential statistics, provide the descriptive statistics, confidence intervals, and sample sizes for each variable as well as the value of the test statistic, its direction, the degrees of freedom, and the significance level [report the actual p value] (see the sketch after this list).
  • Avoid inferring causality, particularly in nonrandomized designs or without further experimentation.
  • Use tables to provide exact values; use figures to convey global effects. Keep figures small in size; include graphic representations of confidence intervals whenever possible.
  • Always tell the reader what to look for in tables and figures.
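As a concrete, hypothetical illustration of the items above (this sketch is ours, not the guide’s, and uses invented scores), the snippet below runs a two-sample t test in Python and prints each value the list asks for: group descriptive statistics, a confidence interval for the difference, the test statistic with its degrees of freedom, and the exact p value.

```python
# Illustrative sketch only: assembling the quantities named above (descriptive
# statistics, confidence interval, test statistic, degrees of freedom, exact
# p value) into a single reportable line.
import numpy as np
from scipy import stats

# Hypothetical scores for two groups
a = np.array([4.1, 5.3, 4.8, 6.0, 5.1, 4.7, 5.5, 4.9])
b = np.array([3.2, 4.0, 3.8, 4.4, 3.6, 4.1, 3.9, 3.5])
n_a, n_b = len(a), len(b)

t_stat, p_value = stats.ttest_ind(a, b)   # pooled-variance t-test (scipy default)
df = n_a + n_b - 2
diff = a.mean() - b.mean()

# 95% confidence interval for the mean difference, using the pooled variance
sp2 = ((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1)) / df
se = np.sqrt(sp2 * (1 / n_a + 1 / n_b))
margin = stats.t.ppf(0.975, df) * se

print(
    f"M1 = {a.mean():.2f} (SD = {a.std(ddof=1):.2f}), "
    f"M2 = {b.mean():.2f} (SD = {b.std(ddof=1):.2f}), "
    f"difference = {diff:.2f}, 95% CI [{diff - margin:.2f}, {diff + margin:.2f}], "
    f"t({df}) = {t_stat:.2f}, p = {p_value:.3f}"
)
```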

NOTE:   When using pre-existing statistical data gathered and made available by anyone other than yourself [e.g., government agency], you still must report on the methods that were used to gather the data and describe any missing data that exists and, if there is any, provide a clear explanation why the missing data does not undermine the validity of your final analysis.

Babbie, Earl R. The Practice of Social Research. 12th ed. Belmont, CA: Wadsworth Cengage, 2010; Brians, Craig Leonard et al. Empirical Political Analysis: Quantitative and Qualitative Research Methods. 8th ed. Boston, MA: Longman, 2011; McNabb, David E. Research Methods in Public Administration and Nonprofit Management: Quantitative and Qualitative Approaches. 2nd ed. Armonk, NY: M.E. Sharpe, 2008; Quantitative Research Methods. Writing@CSU. Colorado State University; Singh, Kultar. Quantitative Social Research Methods. Los Angeles, CA: Sage, 2007.

Basic Research Design for Quantitative Studies

Before designing a quantitative research study, you must decide whether it will be descriptive or experimental because this will dictate how you gather, analyze, and interpret the results. A descriptive study is governed by the following rules: subjects are generally measured once; the intention is to only establish associations between variables; and, the study may include a sample population of hundreds or thousands of subjects to ensure that a valid estimate of a generalized relationship between variables has been obtained. An experimental design includes subjects measured before and after a particular treatment, the sample population may be very small and purposefully chosen, and it is intended to establish causality between variables.

Introduction

The introduction to a quantitative study is usually written in the present tense and from the third person point of view. It covers the following information:

  • Identifies the research problem -- as with any academic study, you must state clearly and concisely the research problem being investigated.
  • Reviews the literature -- review scholarship on the topic, synthesizing key themes and, if necessary, noting studies that have used similar methods of inquiry and analysis. Note where key gaps exist and how your study helps to fill these gaps or clarifies existing knowledge.
  • Describes the theoretical framework -- provide an outline of the theory or hypothesis underpinning your study. If necessary, define unfamiliar or complex terms, concepts, or ideas and provide the appropriate background information to place the research problem in proper context [e.g., historical, cultural, economic, etc.].

Methodology

The methods section of a quantitative study should describe how each objective of your study will be achieved. Be sure to provide enough detail to enable the reader to make an informed assessment of the methods being used to obtain results associated with the research problem. The methods section should be presented in the past tense.

  • Study population and sampling -- where did the data come from; how robust is it; note where gaps exist or what was excluded. Note the procedures used for their selection;
  • Data collection – describe the tools and methods used to collect information and identify the variables being measured; describe the methods used to obtain the data; and, note if the data was pre-existing [e.g., government data] or you gathered it yourself. If you gathered it yourself, describe what type of instrument you used and why. Note that no data set is perfect--describe any limitations in your methods of gathering data.
  • Data analysis -- describe the procedures for processing and analyzing the data. If appropriate, describe the specific instruments of analysis used to study each research objective, including mathematical techniques and the type of computer software used to manipulate the data.

Results

The findings of your study should be written objectively and in a succinct and precise format. In quantitative studies, it is common to use graphs, tables, charts, and other non-textual elements to help the reader understand the data. Make sure that non-textual elements do not stand in isolation from the text but are used to supplement the overall description of the results and to help clarify key points being made.

  • Statistical analysis -- how did you analyze the data? What were the key findings from the data? The findings should be presented in a logical, sequential order. Describe but do not interpret these trends or negative results; save that for the discussion section. The results should be presented in the past tense.

Discussion

Discussions should be analytic, logical, and comprehensive. The discussion should meld together your findings in relation to those identified in the literature review and place them within the context of the theoretical framework underpinning the study. The discussion should be presented in the present tense.

  • Interpretation of results -- reiterate the research problem being investigated and compare and contrast the findings with the research questions underlying the study. Did they affirm predicted outcomes or did the data refute them?
  • Description of trends, comparison of groups, or relationships among variables -- describe any trends that emerged from your analysis and explain all unanticipated and statistically insignificant findings.
  • Discussion of implications – what is the meaning of your results? Highlight key findings based on the overall results and note findings that you believe are important. How have the results helped fill gaps in understanding the research problem?
  • Limitations -- describe any limitations or unavoidable bias in your study and, if necessary, note why these limitations did not inhibit effective interpretation of the results.

Conclusion

End your study by summarizing the topic and providing a final comment and assessment of the study.

  • Summary of findings – synthesize the answers to your research questions. Do not report any statistical data here; just provide a narrative summary of the key findings and describe what was learned that you did not know before conducting the study.
  • Recommendations – if appropriate to the aim of the assignment, tie key findings with policy recommendations or actions to be taken in practice.
  • Future research – note the need for future research linked to your study’s limitations or to any remaining gaps in the literature that were not addressed in your study.

Black, Thomas R. Doing Quantitative Research in the Social Sciences: An Integrated Approach to Research Design, Measurement and Statistics. London: Sage, 1999; Gay, L. R. and Peter Airasian. Educational Research: Competencies for Analysis and Applications. 7th edition. Upper Saddle River, NJ: Merrill Prentice Hall, 2003; Hector, Anestine. An Overview of Quantitative Research in Composition and TESOL. Department of English, Indiana University of Pennsylvania; Hopkins, Will G. “Quantitative Research Design.” Sportscience 4, 1 (2000); "A Strategy for Writing Up Research Results. The Structure, Format, Content, and Style of a Journal-Style Scientific Paper." Department of Biology. Bates College; Nenty, H. Johnson. "Writing a Quantitative Research Thesis." International Journal of Educational Science 1 (2009): 19-32; Ouyang, Ronghua (John). Basic Inquiry of Quantitative Research. Kennesaw State University.

Strengths of Using Quantitative Methods

Quantitative researchers try to recognize and isolate specific variables contained within the study framework, seek correlation, relationships and causality, and attempt to control the environment in which the data is collected to avoid the risk of variables, other than the one being studied, accounting for the relationships identified.

Among the specific strengths of using quantitative methods to study social science research problems are:

  • Allows for a broader study, involving a greater number of subjects, and enhancing the generalization of the results;
  • Allows for greater objectivity and accuracy of results. Generally, quantitative methods are designed to provide summaries of data that support generalizations about the phenomenon under study. In order to accomplish this, quantitative research usually involves few variables and many cases, and employs prescribed procedures to ensure validity and reliability;
  • Applying well established standards means that the research can be replicated, and then analyzed and compared with similar studies;
  • You can summarize vast sources of information and make comparisons across categories and over time; and,
  • Personal bias can be avoided by keeping a 'distance' from participating subjects and using accepted computational techniques.

Babbie, Earl R. The Practice of Social Research. 12th ed. Belmont, CA: Wadsworth Cengage, 2010; Brians, Craig Leonard et al. Empirical Political Analysis: Quantitative and Qualitative Research Methods. 8th ed. Boston, MA: Longman, 2011; McNabb, David E. Research Methods in Public Administration and Nonprofit Management: Quantitative and Qualitative Approaches. 2nd ed. Armonk, NY: M.E. Sharpe, 2008; Singh, Kultar. Quantitative Social Research Methods. Los Angeles, CA: Sage, 2007.

Limitations of Using Quantitative Methods

Quantitative methods presume to have an objective approach to studying research problems, where data is controlled and measured, to address the accumulation of facts, and to determine the causes of behavior. As a consequence, the results of quantitative research may be statistically significant but are often humanly insignificant.

Some specific limitations associated with using quantitative methods to study research problems in the social sciences include:

  • Quantitative data is more efficient and able to test hypotheses, but may miss contextual detail;
  • Uses a static and rigid approach and so employs an inflexible process of discovery;
  • The development of standard questions by researchers can lead to "structural bias" and false representation, where the data actually reflects the view of the researcher instead of the participating subject;
  • Results provide less detail on behavior, attitudes, and motivation;
  • Researcher may collect a much narrower and sometimes superficial dataset;
  • Results are limited as they provide numerical descriptions rather than detailed narrative and generally provide less elaborate accounts of human perception;
  • The research is often carried out in an unnatural, artificial environment so that a level of control can be applied to the exercise. This level of control might not normally be in place in the real world thus yielding "laboratory results" as opposed to "real world results"; and,
  • Preset answers will not necessarily reflect how people really feel about a subject and, in some cases, might just be the closest match to the preconceived hypothesis.

Research Tip

Finding Examples of How to Apply Different Types of Research Methods

SAGE publications is a major publisher of studies about how to design and conduct research in the social and behavioral sciences. Their SAGE Research Methods Online and Cases database includes contents from books, articles, encyclopedias, handbooks, and videos covering social science research design and methods including the complete Little Green Book Series of Quantitative Applications in the Social Sciences and the Little Blue Book Series of Qualitative Research techniques. The database also includes case studies outlining the research methods used in real research projects. This is an excellent source for finding definitions of key terms and descriptions of research design and practice, techniques of data gathering, analysis, and reporting, and information about theories of research [e.g., grounded theory]. The database covers both qualitative and quantitative research methods as well as mixed methods approaches to conducting research.

SAGE Research Methods Online and Cases


Techniques for Reporting Quantitative Data

Md. Mahsin, in Principles of Social Research Methodology, pp. 257–260. First Online: 27 October 2022.

A quantitative research report is a way of describing the completed study to other people. The findings are communicated through an oral presentation, a book, or a published paper. The report disseminates the results to research scientists or to policy decision-makers and stakeholders. It may be written in plain words so that a layperson can understand it, or it may be highly technical so that a specialized target audience can understand it easily. It can be organized in many different ways depending on the intended audience and the author’s style. A rough sequence of steps for writing a quantitative research report is described in this section:

Specify a summary or abstract of the report to give a quick picture of the research article, thesis, review paper, conference proceeding, or in-depth analysis of a particular subject.

Define the research problem and discuss the methodology approach.

Present the results and findings and finally summarize the significance of the conclusions.

Keywords: Social research, Data reporting, Quantitative data



Mahsin, M. (2022). Techniques for Reporting Quantitative Data. In: Islam, M.R., Khan, N.A., Baikady, R. (eds) Principles of Social Research Methodology. Springer, Singapore. https://doi.org/10.1007/978-981-19-5441-2_17



How to Write a Results Section | Tips & Examples

Published on August 30, 2022 by Tegan George. Revised on July 18, 2023.

A results section is where you report the main findings of the data collection and analysis you conducted for your thesis or dissertation. You should report all relevant results concisely and objectively, in a logical order. Don’t include subjective interpretations of why you found these results or what they mean—any evaluation should be saved for the discussion section.


When conducting research, it’s important to report the results of your study prior to discussing your interpretations of it. This gives your reader a clear idea of exactly what you found and keeps the data itself separate from your subjective analysis.

Here are a few best practices:

  • Your results should always be written in the past tense.
  • While the length of this section depends on how much data you collected and analyzed, it should be written as concisely as possible.
  • Only include results that are directly relevant to answering your research questions . Avoid speculative or interpretative words like “appears” or “implies.”
  • If you have other results you’d like to include, consider adding them to an appendix or footnotes.
  • Always start out with your broadest results first, and then flow into your more granular (but still relevant) ones. Think of it like a shoe store: first discuss the shoes as a whole, then the sneakers, boots, sandals, etc.


Reporting quantitative research results

If you conducted quantitative research, you’ll likely be working with the results of some sort of statistical analysis.

Your results section should report the results of any statistical tests you used to compare groups or assess relationships between variables. It should also state whether or not each hypothesis was supported.

The most logical way to structure quantitative results is to frame them around your research questions or hypotheses. For each question or hypothesis, share:

  • A reminder of the type of analysis you used (e.g., a two-sample t test or simple linear regression). A more detailed description of your analysis should go in your methodology section.
  • A concise summary of each relevant result, both positive and negative. This can include any relevant descriptive statistics (e.g., means and standard deviations) as well as inferential statistics (e.g., t scores, degrees of freedom, and p values). Remember, these numbers are often placed in parentheses.
  • A brief statement of how each result relates to the question, or whether the hypothesis was supported. You can briefly mention any results that didn’t fit with your expectations and assumptions, but save any speculation on their meaning or consequences for your discussion and conclusion.

A note on tables and figures

In quantitative research, it’s often helpful to include visual elements such as graphs, charts, and tables, but only if they are directly relevant to your results. Give these elements clear, descriptive titles and labels so that your reader can easily understand what is being shown. If you want to include any other visual elements that are more tangential in nature, consider adding a figure and table list.

As a rule of thumb:

  • Tables are used to communicate exact values, giving a concise overview of various results
  • Graphs and charts are used to visualize trends and relationships, giving an at-a-glance illustration of key findings

Don’t forget to also mention any tables and figures you used within the text of your results section. Summarize or elaborate on specific aspects you think your reader should know about rather than merely restating the same numbers already shown.
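To show what that guidance can look like in practice, here is a small, hypothetical Python/matplotlib sketch (our illustration, using invented numbers rather than the article’s data) that plots group means for a donation-intention example with 95% confidence intervals drawn as error bars, in line with the advice to convey global effects graphically and to include confidence intervals in figures.

```python
# Illustrative sketch only (invented data): a minimal figure with 95%
# confidence intervals shown as error bars on group means.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Hypothetical donation-intention scores (1-10) for two groups
groups = {
    "Low social distance": np.array([5.1, 5.4, 4.9, 5.6, 5.2, 5.0, 5.3]),
    "High social distance": np.array([5.5, 5.8, 5.4, 5.9, 5.6, 5.7, 5.3]),
}

means = [g.mean() for g in groups.values()]
# Half-width of a 95% CI for each group mean: t critical value * standard error
cis = [stats.t.ppf(0.975, len(g) - 1) * stats.sem(g) for g in groups.values()]

fig, ax = plt.subplots(figsize=(4, 3))
ax.bar(list(groups.keys()), means, yerr=cis, capsize=6)
ax.set_ylabel("Donation intention (1-10)")
ax.set_title("Mean donation intention by social distance (95% CI)")
fig.tight_layout()
fig.savefig("figure1.png", dpi=300)   # hypothetical output file name
```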

A two-sample t test was used to test the hypothesis that higher social distance from environmental problems would reduce the intent to donate to environmental organizations, with donation intention (recorded as a score from 1 to 10) as the outcome variable and social distance (categorized as either a low or high level of social distance) as the predictor variable. Social distance was found to be positively correlated with donation intention, t(98) = 12.19, p < .001, with the donation intention of the high social distance group 0.28 points higher, on average, than the low social distance group (see figure 1). This contradicts the initial hypothesis that social distance would decrease donation intention, and in fact suggests a small effect in the opposite direction.

Example of using figures in the results section

Figure 1: Intention to donate to environmental organizations based on social distance from impact of environmental damage.

Reporting qualitative research results

In qualitative research, your results might not all be directly related to specific hypotheses. In this case, you can structure your results section around key themes or topics that emerged from your analysis of the data.

For each theme, start with general observations about what the data showed. You can mention:

  • Recurring points of agreement or disagreement
  • Patterns and trends
  • Particularly significant snippets from individual responses

Next, clarify and support these points with direct quotations. Be sure to report any relevant demographic information about participants. Further information (such as full transcripts, if appropriate) can be included in an appendix.

When asked about video games as a form of art, the respondents tended to believe that video games themselves are not an art form, but agreed that creativity is involved in their production. The criteria used to identify artistic video games included design, story, music, and creative teams. One respondent (male, 24) noted a difference in creativity between popular video game genres:

“I think that in role-playing games, there’s more attention to character design, to world design, because the whole story is important and more attention is paid to certain game elements […] so that perhaps you do need bigger teams of creative experts than in an average shooter or something.”

Responses suggest that video game consumers consider some types of games to have more artistic potential than others.

Results vs. discussion vs. conclusion

Your results section should objectively report your findings, presenting only brief observations in relation to each question, hypothesis, or theme.

It should not speculate about the meaning of the results or attempt to answer your main research question. Detailed interpretation of your results is more suitable for your discussion section, while synthesis of your results into an overall answer to your main research question is best left for your conclusion.


Checklist: Research results

  • I have completed my data collection and analyzed the results.
  • I have included all results that are relevant to my research questions.
  • I have concisely and objectively reported each result, including relevant descriptive statistics and inferential statistics.
  • I have stated whether each hypothesis was supported or refuted.
  • I have used tables and figures to illustrate my results where appropriate.
  • All tables and figures are correctly labelled and referred to in the text.
  • There is no subjective interpretation or speculation on the meaning of the results.

You've finished writing up your results! Use the other checklists to further improve your thesis.


Frequently asked questions about results sections

The results chapter of a thesis or dissertation presents your research results concisely and objectively.

In quantitative research, for each question or hypothesis, state:

  • The type of analysis used
  • Relevant results in the form of descriptive and inferential statistics
  • Whether or not the alternative hypothesis was supported

In qualitative research, for each question or theme, describe:

  • Recurring patterns
  • Significant or representative individual responses
  • Relevant quotations from the data

Don’t interpret or speculate in the results chapter.

Results are usually written in the past tense, because they are describing the outcome of completed actions.

The results chapter or section simply and objectively reports what you found, without speculating on why you found these results. The discussion interprets the meaning of the results, puts them in context, and explains why they matter.

In qualitative research, results and discussion are sometimes combined. But in quantitative research, it’s considered important to separate the objective results from your interpretation of them.

Cite this Scribbr article:

George, T. (2023, July 18). How to Write a Results Section | Tips & Examples. Scribbr. Retrieved November 3, 2023, from https://www.scribbr.com/dissertation/results/



Additional resources for quantitative data analysis

Learning objectives.

Learners will be able to…

  • Identify open textbooks and resources to assist with statistical analysis.
  • Identify open source and commercial software used to perform statistical analysis.

While you are affiliated with a university, it is likely that you will have access to some kind of commercial statistics software. Examples in the previous section use SPSS, the most common one our authoring team has seen in social work education. Like its competitors SAS and STATA, SPSS is expensive and your license to the software must be renewed every year (like a subscription). Even if you are able to install commercial statistics software on your computer, once your license expires, your program will no longer work. We believe that forcing students to learn software they will never use is wasteful and contributes to the (accurate, in many cases) perception from students that research class is unrelated to real-world practice. SPSS is more accessible due to its graphical user interface and does not require researchers to learn basic computer programming, but it is prohibitively costly if a student wanted to use it to measure practice data in their agency post-graduation.

Instead, we suggest getting familiar with JASP Statistics, a free and open-source alternative to SPSS developed and supported by the University of Amsterdam. It has a user interface similar to SPSS’s and should be similarly easy to learn. Moreover, usability upgrades over SPSS, like generating APA-formatted tables, make it a compelling option. While a great many of my students will rely on statistical analyses of their programs and practices in reports to funders, it is unlikely that any will use SPSS. Browse JASP’s how-to guide or consult the textbook Learning Statistics with JASP: A Tutorial for Psychology Students and Other Beginners, written by Danielle J. Navarro, David R. Foxcroft, and Thomas J. Faulkenberry.

Another open source statistics software package is R (a.k.a. The R Project for Statistical Computing). R uses a command line interface, so you will need to learn how to program computer code in order to use it. Luckily, R is the most commonly used statistics software in the world, and the community of support and guides for using R is omnipresent online. For beginning researchers, consult the textbook Learning Statistics with R: A tutorial for psychology students and other beginners by Danielle J. Navarro.

While statistics software is sometimes needed to perform advanced statistical tests, most univariate and bivariate tests can be performed in spreadsheet software like Microsoft Excel, Google Sheets, or the free and open source LibreOffice Calc. Microsoft offers a ToolPak add-on to Excel to perform complex data analysis. For more information on using spreadsheet software to perform statistics, see the open textbook Collaborative Statistics Using Spreadsheets by Susan Dean, Irene Mary Duranczyk, Barbara Illowsky, Suzanne Loch, and Janet Stottlemyer.

Statistical analysis is performed in just about every discipline, and as a result, there are a lot of openly licensed, free resources to assist you with your data analysis. We have endeavored to provide you the basics in the past few chapters, but ultimately, you will likely need additional support in completing quantitative data analysis from an instructor, textbook, or other resource. Browse the Open Textbook Library for statistics resources or look for video tutorials from reputable instructors like this video textbook on statistics by Bryan Koenig .

Key Takeaways

  • While the statistics software your school purchases is very expensive, there are free and easy-to-use alternatives you can learn and continue to use post-graduation.
  • There are a lot of high quality and free online resources to learn and perform statistical analysis.



Research Rockstar Training Portal

Writing Quantitative Research Reports

What makes for a great quantitative research report? In this class taught by Kathryn Korostoff, you learn how to write a great quantitative market research report—even if you are new to report writing—in a fun, practical way. Here's what the course includes.

Course overview

Learn how to write a great quantitative market research report, even if you are new to report writing.

What makes for a great quantitative research report? It needs to synthesize and present survey research findings in a way that your audience will find useful and memorable. You want your audience to understand and retain key research findings and to maximize the chance they will put them to use.

The options for interpreting, synthesizing and reporting quantitative data are taught here in a fun, practical way.

In this 4-part program, Instructor Kathryn Korostoff teaches students how to interpret quantitative data in order to address project objectives, and how to report the findings using various text, visual display and even multimedia approaches.

  • This class includes homework assignments, and students should be prepared to spend one hour per week on homework.
  • This class does not include how to conduct Quantitative Data Analysis; that is a separate class titled, “Quantitative Data Analysis for Survey Research.”
  • This class teaches a PowerPoint approach to reporting.
  • Prerequisites: At least two years of market research professional experience OR completion of Market Research 101.

Upon course completion, you will be able to:

  • Craft a plan that will mitigate the risk of schedule slips
  • Optimize reporting formats for different audiences
  • Synthesize data efficiently
  • Develop key findings that will inspire insights
  • Deliver bad news constructively

As a student, you get lots of helpful videos, readings and reference material.


Take a look inside the course

Course curriculum.

Getting Ready to Rock!

Access Your Live Sessions Here

Pre-assignment

Optional Pre-assignment [Slide File Download]

View the pre-assignment slides online

Know Your Audience, Planning for Impact

Lesson Plan

Know Your Audience, Planning for Impact [Lecture]

Know Your Audience, Planning for Impact [Slide Viewer]

Know Your Audience, Planning for Impact [Slide File Download]

Rockstar Practice: Comparing Two Report Styles

Research Report Example A

Research Report Example B

Know Your Audience [Downloadable Job Aid]

Part 1 Quiz: What Makes a PowerPoint Report “Modern” versus "Dated"?

Writing Tips: Style, Voice, Powerful Words

Bonus: Free Rockstar Report Template (PPT)

Market Research Report Outline and Tips [Downloadable Job Aid]

Reading & Summarizing Data

Reading & Summarizing Data [Lecture]

Reading & Summarizing Data [Slide Viewer]

Reading & Summarizing Data [Slide File Download]

Scope & Methodology Checklist [Downloadable Job Aid]

Rockstar Practice: Summarizing Survey Data

Management Summaries That Ignite Insights

Management Summaries That Ignite Insights [Lecture]

Management Summaries That Ignite Insights [Slide Viewer]

Management Summaries That Ignite Insights [Slide File Download]

Final Steps That Boost Credibility & Impact

Final Steps That Boost Credibility & Impact [Lecture]

Final Steps That Boost Credibility & Impact [Slide Viewer]

Final Steps That Boost Credibility & Impact [Slide File Download]

Planning Your Post-Mortem Process [Downloadable Job Aid]

Bonus Resources

Recommended Reading & Viewing

Quantitative Research Report Examples

4 Survey Report Samples [Mini-Lesson]

Scope & Methodology [Mini-Lesson]

Your Rockstar Status Awaits: Final Assessment

Our Short Feedback Survey

Your candid feedback helps us improve

Meet Your Instructor


Kathryn Korostoff

President and Lead Instructor, Research Rockstar Training and Staffing

Frequently Asked Questions

We are committed to advancing the work, and careers, of market research & insights professionals.

Many Research Rockstar students enjoy the schedule flexibility of our on-demand training format. That's great! But for those seeking an interactive experience with real-time instructor demonstrations and support, nothing beats our real-time sessions. All Research Rockstar courses are offered twice a year in real time, and 24/7 in the on-demand format.

Most of our students are working professionals with busy schedules. Sometimes they intend to complete training more quickly than looming project deadlines allow. With a membership, students have access to all of their classes. No time-outs!

Yes. We offer free previews of 5 courses. Just select the course (or courses) of interest. Be sure to watch a lecture, read a case study or even try a sample quiz.

Teams of 5 or more students are offered team pricing. For more information, please see our Team Pricing page.

Ready to Rock?

Get access to this course and 25+ market research courses with a Backstage Pass membership.


Open Access

Peer-reviewed

Research Article

Improving quantitative writing one sentence at a time

Tracy Ruscetti, Katherine Krueger, Christelle Sabatier

Affiliation: Biology Department, Santa Clara University, Santa Clara, California, United States of America

* E-mail: [email protected]

Author contributions (roles listed on the article include): Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Supervision, Validation, Writing – original draft, and Writing – review & editing.

  • Published: September 12, 2018
  • https://doi.org/10.1371/journal.pone.0203109

Scientific writing, particularly quantitative writing, is difficult to master. To help undergraduate students write more clearly about data, we sought to deconstruct writing into discrete, specific elements. We focused on statements typically used to describe data found in the results sections of research articles (quantitative comparative statements, QC). In this paper, we define the essential components of a QC statement and the rules that govern those components. Clearly defined rules allowed us to quantify writing quality of QC statements (4C scoring). Using 4C scoring, we measured student writing gains in a post-test at the end of the term compared to a pre-test (37% improvement). In addition to overall score, 4C scoring provided insight into common writing mistakes by measuring presence/absence of each essential component. Student writing quality in lab reports improved when they practiced writing isolated QC statements. Although we observed a significant increase in writing quality in lab reports describing a simple experiment, we noted a decrease in writing quality when the complexity of the experimental system increased. Our data suggest a negative correlation of writing quality with complexity. We discuss how our data aligns with existing cognitive theories of writing and how science instructors might improve the scientific writing of their students.

Citation: Ruscetti T, Krueger K, Sabatier C (2018) Improving quantitative writing one sentence at a time. PLoS ONE 13(9): e0203109. https://doi.org/10.1371/journal.pone.0203109

Editor: Mitchell Rabinowitz, Fordham University, UNITED STATES

Received: August 26, 2017; Accepted: August 15, 2018; Published: September 12, 2018

Copyright: © 2018 Ruscetti et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All relevant data are within the paper and its Supporting Information files.

Funding: The authors received financial support from Santa Clara University through the Faculty Development Office (T.R.) and the Office of Assessment (T.R. and C.S.).

Competing interests: The authors have declared that no competing interests exist.

Introduction

Written communication of data is at the core of scholarly discourse among scientists and is an important learning goal for science students in undergraduate education [ 1 ]. For scientists, the currency of scientific dialogue is the research article, which presents essential information required to convince an audience that data are compelling, findings are relevant, and interpretations are valid [ 2 , 3 ]. Writing lab reports that contain the elements of a research article is a widely used method to help students develop critical thinking and quantitative reasoning skills. In our introductory, lab-intensive Cell and Molecular Biology course, we focus on helping students develop the “results” section of their lab report. Students integrate tables, graphs, and text to present and interpret data they have generated in the laboratory. In the text portion, students cannot simply restate previously learned information (“knowledge telling;” [ 4 , 5 ]) or narrate through the data presented visually. Rather, students must mimic the actions of professional researchers by transforming data into knowledge and structuring their arguments to support specific claims/conclusions. This type of inquiry-based writing encourages active participation in the scientific process, enhancing engagement and learning [ 6 , 7 ].

While science instructors recognize the importance of writing in their courses, many do not provide explicit writing instruction [8]. Instructors may fear that teaching writing skills diverts time from teaching required science concepts, expect that writing is covered in composition courses, or lack the tools and resources to teach writing [8, 9, 10]. We wanted to support writing in our course without diverting focus from the conceptual and discipline-specific content of the course. We examined available writing resources (e.g., books, websites) and found substantial resources on the macro structure of the report (e.g., describing the sections and broad organization of lab reports) [11, 12]. We also found resources for sentence-level support related to emphasis and voice [13]. However, these resources do not give students explicit guidance as to how to write about quantitative information. Thus, it is not surprising that many students struggle to both construct appropriate quantitative evidence statements and express them in writing [14].

There are, however, a few important resources that explore the structure of writing about quantitative information. Each describes comparisons as a primary mode of providing quantitative evidence (e.g., The lifespan of cells grown in the presence of drug is 25% shorter than the lifespan of control cells). In her book about writing about numbers, Miller discusses "quantitative comparisons" as a fundamental skill in quantitative writing [15]. Jessica Polito states that many disciplines use comparisons as the basis of quantitative evidence statements that support conclusions [14], and Grawe uses the presence of a comparison as a measure of sophisticated quantitative writing [16]. We focused on these types of comparative evidence statements and called them Quantitative Comparative statements (QC). We found this type of statement was commonly used to describe data in the scientific literature, and we decided to emphasize the correct construction of these statements in student writing.

We analyzed over a thousand QC statements from student and professional scientific writing to discover the critical elements of a QC statement and the rules that govern those elements. We found that a QC statement needs to have a comparison, a quantitative relational phrase, and at least one contextual element. These essential elements of the QC statement can be thought of as sentence-level syntax. We then developed a metric to measure writing syntax of the QC statement and by proxy, quantitative writing quality. We examined the effectiveness of different approaches to support writing in a course setting and show that practice writing QC statements with feedback can improve student writing. We also investigated how the circumstances of the writing assignment can change the quality of quantitative writing. Together, these data provide insight into how we might improve undergraduate science writing instruction and the clarity of scientific writing.

Methods and materials

Student population and course structure.

We collected data at Santa Clara University (SCU), a private liberal arts university that is a primarily undergraduate institution. Participants were recruited from BIOL25 –Investigations in Cell and Molecular Biology, a lower-division biology course. Prerequisites include a quarter of introductory physiology, a year (3 quarters) of general chemistry and one quarter of organic chemistry. BIOL25 consists of three interactive lecture periods (65 minutes) and one laboratory period (165 minutes) per week. The lecture periods focus on preparing for the laboratory experience, analysis, interpretation, and presentation of data. Laboratory sessions focus on data collection, data analysis and peer feedback activities. During the 10-week quarter, two experimental modules (Enzyme Kinetics and Transcription Regulation) culminate in a lab report. Students organize and communicate their analyzed data in tables and graphs and communicate their conclusions and reasoning in written form. We provide a detailed rubric for the lab reports and a set of explicit instructions for each lab report ( S2 Fig ). In addition, students participate in peer feedback activities with an opportunity to revise prior to submission.

The basic structure of the course was unchanged between 2014 and 2016. The students were distributed among two lecture sections taught by the same instructors and 13 laboratory sections led by 5 different instructors. All students included in this study signed an informed consent form (213 of 214). This study was reviewed and approved by the Santa Clara University Institutional Review Board (project #15-09-700).

Instructional support

General writing feedback (2014–2016).

In all iterations of the course discussed in this article, students received general writing feedback after each lab report. In each lab report, students wrote paragraphs in response to prompting questions regarding the data. Writing feedback was holistic and included phrases such as “not quantitative”, or “inappropriate comparison,” but was not specific to any type of sentence.

Calculation support (2015–2016).

In 2015 and 2016, students were explicitly introduced to strategies for quantifying relational differences between data points such as percent difference and fold change. Students were given opportunities to practice calculating these values during in class activities prior to writing their lab reports. We stressed that phrases such as more than, drastically higher, and vanishingly small were not quantitative.
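Percent difference and fold change, the two relational calculations mentioned above, are straightforward to compute. The short Python sketch below is only an illustration of those calculations; the function names and the enzyme-activity values are invented for the example and are not taken from the course materials.

```python
# Illustrative sketch of the two relational calculations discussed above.
# The example values are hypothetical, not data from the study.

def percent_difference(treated: float, control: float) -> float:
    """Percent difference of a treated value relative to a control value."""
    return (treated - control) / control * 100


def fold_change(treated: float, control: float) -> float:
    """Fold change of a treated value relative to a control value."""
    return treated / control


control_activity = 40.0  # hypothetical enzyme activity, control condition
treated_activity = 52.0  # hypothetical enzyme activity, drug condition

print(f"Percent difference: {percent_difference(treated_activity, control_activity):+.1f}%")
print(f"Fold change: {fold_change(treated_activity, control_activity):.2f}x")
# A QC statement built from these numbers might read:
# "Enzyme activity was 30% higher in the drug condition than in the control condition."
```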

Explicit QC statement writing support (2016).

In 2016, we introduced and practiced using quantitative comparative statements as the means to communicate quantitative results. In class, we discussed including an explicit comparison of two conditions and the quantitative relationship between them. Before each lab report, we asked students to write quantitative comparative statements related to the data. We provided formative feedback on the accuracy of the statement and general feedback such as "not quantitative" or "inappropriate comparison". Students in this study were never exposed to the concept of 4C annotation or scoring. We used the scoring strategy exclusively to measure their writing progress.

Identification of quantitative comparative statements (QC)

Quantitative comparative statements are a subset of evidence statements. In native writing (scientific articles or student lab reports), we identified QC statements by the presence of 1) a relational preposition (between, among, etc.), 2) a prepositional phrase ("compared to", "faster/slower than", etc.), 3) a statistical reference (p value), or 4) quantified change (3-fold, 10% different).
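As a rough illustration of how such markers can be flagged in text, the Python sketch below searches sentences for the four kinds of cues listed above. The cue lists and regular expressions are simplified assumptions of mine; the authors identified QC statements by hand, not with code.

```python
import re

# Simplified cues loosely based on the four markers described above.
QC_CUES = [
    r"\b(between|among|compared (to|with)|relative to)\b",                    # relational preposition/phrase
    r"\b(faster|slower|higher|lower|greater|smaller|shorter|longer) than\b",  # comparative phrase
    r"\bp\s*[<>=]\s*0?\.\d+",                                                 # statistical reference (p value)
    r"\b\d+(\.\d+)?\s*(%|percent|-?\s*fold)",                                 # quantified change
]

def looks_like_qc_statement(sentence: str) -> bool:
    """Return True if the sentence contains at least one QC cue."""
    return any(re.search(cue, sentence, flags=re.IGNORECASE) for cue in QC_CUES)

example = ("The lifespan of cells grown in the presence of drug is "
           "25% shorter than the lifespan of control cells.")
print(looks_like_qc_statement(example))  # True
```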

Syntactic elements of QC statements

We examined a corpus of over 1000 QC statements to identify and characterize the essential elements of a QC statement and the rules that govern those elements. Quantitative comparative statements generally take the form of “ The activity of the enzyme is 30% higher in condition X compared to condition Y ”. We identified three critical elements of the quantitative comparative statement: the things being compared (Comparison, condition X and condition Y ), the quantitative relationship between those conditions (Calculation, 30% higher ), and the measurement that gave rise to the compared values (Context, enzyme activity ). Finally, all three elements must be in the same sentence with no redundancy or contradiction (Clarity). These rules are collectively called “4C”.

Syntactic rules for quantitative comparative statements

The Calculation must quantify the relationship between the two compared elements and include both magnitude and direction. Fold change or percent difference are common methods of describing quantitative relationships [15]. Absolute or raw values are not sufficient to describe the relationship between the compared elements. If there is no significant difference between the compared elements, then statistical data must be cited. Context provides additional information about the measurement from which the quantitative comparison was derived, such as growth rate, enzyme activity, etc., or the time at which the comparison was made. The context should be the same for both of the compared elements. Comparisons are usually between like elements (e.g., time vs. time, condition vs. condition), and there should be two and only two compared elements in a single sentence. Both compared elements must be explicitly stated so that the reader is not left guessing the writer's intended comparison. A QC statement has Clarity when all three elements are present and in the same sentence. We consider a statement to be "unclear" if it contains inconsistencies or redundancies.

Annotation and scoring of QC statements

We use “annotation” to describe the visual marking of the critical elements of the quantitative comparative statement. We use “scoring” to mean the assignment of a score to a quantitative comparative statement. 4C annotation and 4C scoring do not reflect whether the statement or any of its components are correct, but rather they highlight the syntactic structure of the quantitative comparative statement ( Fig 1 ).

Fig 1. (A) Original quantitative comparative statement. (B) Identify and box the relational phrase with both magnitude and direction. (C) Circle what the relational phrase refers to (context). (D) Underline the comparison. (E) Fully 4C annotated quantitative comparative statement.

https://doi.org/10.1371/journal.pone.0203109.g001

Annotation process.

We scanned the results sections of published primary journal articles or student lab reports for relational phrases such as faster than, increased, more than, lower than, etc., and drew a box around the relational phrase, or calculation (Fig 1B). If the calculation was an absolute value, a raw value, referred to no particular value, or was missing the magnitude or direction, we struck through the box. Context. Once the relational phrase, or calculation, was identified, we drew a circle around the information, or context, referred to by the relational phrase (Fig 1C). Comparison. The relational phrase and the context helped us identify the comparison, and we underlined the compared elements (Fig 1D).

4C scoring strategy.

To score an annotated statement, a "1" or a "0" is given to each of the three critical components of the quantitative comparative statement. If all the elements are present in a single sentence and there are no redundancies or inconsistencies, a fourth "1" is awarded for clarity. We call this annotation and scoring strategy "4C" to reflect each of the three critical components and the overall clarity of the statement (Table 1).
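The scoring logic described above can be summarized in a few lines of code. The sketch below is my own minimal restatement of the 4C tally (one point each for Comparison, Calculation, and Context, plus one for Clarity when all three are present without redundancy or contradiction); it is not the authors' scoring instrument, and the presence/absence judgments themselves are still made by a human rater.

```python
from dataclasses import dataclass

@dataclass
class QCAnnotation:
    """Human judgments about one quantitative comparative statement."""
    has_comparison: bool   # both compared elements are explicitly stated
    has_calculation: bool  # magnitude and direction of the difference are given
    has_context: bool      # the measurement the comparison refers to is named
    is_consistent: bool    # no redundancies or contradictions in the sentence

def four_c_score(a: QCAnnotation) -> int:
    """Return a 0-4 score: one point per component, plus one for clarity."""
    component_score = int(a.has_comparison) + int(a.has_calculation) + int(a.has_context)
    clarity = component_score == 3 and a.is_consistent
    return component_score + int(clarity)

# "Enzyme activity was 30% higher in condition X compared to condition Y." -> 4
print(four_c_score(QCAnnotation(True, True, True, True)))
# "Condition X was higher than condition Y." (no calculation, no context) -> 1
print(four_c_score(QCAnnotation(True, False, False, True)))
```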

Table 1: https://doi.org/10.1371/journal.pone.0203109.t001

Student writing samples

Pre-test/post-test.

In 2016, student writing was assessed using identical pre- and post-tests. The pre-test was administered on the first day of class prior to any writing support. The post-test was administered as part of the final exam. The pre/post assessment consisted of a graph and data table ( S1 Fig ). The prompts asked the students to analyze the data to answer a specific question related to the data and to use quantitative comparative statements.

Student sampling for lab report analysis.

For the lab reports in 2016, we sampled 40 students from a stratified student population (based on overall grade in the course) and 4C scored all of their quantitative comparative statements in each lab report. On average, students wrote 5–6 quantitative comparative statements per results section for a total of over 200 4C scored statements for each lab report. We scored over 100 statements from 17–20 lab reports in 2014 and 2015.

Complexity index

We based complexity on the number of values (data points) students would have to parse to develop a QC statement. The complexity of a given experiment is determined in part by the number of conditions tested and the different types of measurements used. For example, in lab report #1 (Enzyme Kinetics) students consider 3 experimental conditions (control and two separate variables) and 2 measurements (Km and Vmax). Thus, we calculated a complexity index of 6 (3 conditions × 2 measurements) for lab report #1. In this measure of complexity index, we assumed that all parameters contributed equally to the complexity of the experiment, and that all parameters were equally likely to be considered by students as they developed their written conclusions. However, by designing specific writing prompts, we could guide students to examine a smaller subset of data points and reduce the complexity of the situation. In lab report #1, for example, we can prompt students to consider only the effect of the treatment on a single variable so that they only consider 2 conditions (the control and the single experimental variable described in the prompt) and 2 measurements. Now students are focused on a subset of the data, and the complexity of the situation could be considered "4".
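Because the complexity index is simply the product of the conditions and measurements a student must consider, it can be written out directly. The sketch below just restates the arithmetic from the lab report #1 example and its prompt-constrained variant.

```python
def complexity_index(n_conditions: int, n_measurements: int) -> int:
    """Number of values a student must parse: conditions x measurements."""
    return n_conditions * n_measurements

# Lab report #1 (Enzyme Kinetics): 3 conditions (control + 2 variables),
# 2 measurements (Km and Vmax).
print(complexity_index(3, 2))  # 6

# A focused writing prompt constrains the task to 2 conditions and 2 measurements.
print(complexity_index(2, 2))  # 4
```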

Quantitative comparative statements are universally used to describe data

Having decided to focus on QC statements in student writing, we first wanted to quantify their occurrence in professional writing. We examined the results sections of all the research articles from three issues of the pan-scientific journals Science, Nature, PLOS ONE, and PNAS. We identified an average of 7–15 QC statements in each research article, with no significant difference in the mean number of QC statements among the different journals (Fig 2, ANOVA, p = 0.194). There was also no difference in the number of QC statements among the different disciplines (Kruskal-Wallis, p = 0.302). Out of the 60 articles examined, we found only one article that did not have a single QC statement to describe the data (Fig 2, Nature). These data suggest that QC statements are used in professional forms of quantitative writing to describe data in many different disciplines.

Fig 2. The mean (middle vertical line) ± SD are shown. Physical science papers are denoted in red, biological sciences in blue, and social sciences in green.

https://doi.org/10.1371/journal.pone.0203109.g002

4C scoring used to measure quantitative writing

In 2016, students practiced writing QC statements related to their data and we provided feedback (see Methods ). We measured the effectiveness of the focused writing practice using 4C scoring of QC statements from a pre- and post-test (see Methods and Table 1 ). We observed a 37% increase in student 4C scores on the post-test assessment compared to the pre-test (p < 0.001, Fig 3A ). In addition, we used 4C scoring to interrogate the impact of the writing intervention on each of the required components of the QC statement ( Fig 3B ). We observed improvements in each of the components of QC statements ( Fig 3C ). In the post-test, over 80% of students included a calculation (magnitude and direction), referred explicitly to both items being compared, and referenced the measurement context for their comparison. Only 25% of students produced completely clear statements, meaning that they were not missing any elements, and did not contain redundant or contradictory phrases. Despite the low post-test clarity score, we observed a 40% improvement in students writing completely clear statements in the post-test compared to the pre-test score ( Fig 3C ).

Fig 3. (A) Mean 4C scores of quantitative comparative statements on an identical pre- and post-test. (B) Percent of statements that contain each of the essential components of a QC statement. (C) Percent difference between the pre-test and post-test, broken down by essential components of QC statements. (***t-test, p < 0.001.) Error bars in A represent standard error of the mean (SEM).

https://doi.org/10.1371/journal.pone.0203109.g003

We next asked if we could measure student learning gains in quantitative writing within the context of a lab report. Students write 2 lab reports per term and we provided varying forms of writing feedback over several iterations of the course (see Methods ). We scored QC statements in two lab reports from 2014 (general writing feedback only), 2015 (general writing feedback and calculation support) and 2016 (general writing feedback, calculation support, and sentence-level writing practice) ( Fig 4A ). There was no appreciable impact on writing quality when we added calculation support to general feedback in 2015 compared to feedback alone in 2014 (t test, p = 0.55, Fig 4A ). However, the addition of sentence-level QC writing support in 2016 resulted in a 22% increase in student mean 4C scores on lab report #1 compared to the same report in 2015 ( Fig 4A , t test, p < 0.05). We noticed the same trends in lab report #2 ( Fig 4B ): general writing feedback and calculation support did not improve scores as compared to general feedback alone (t test, p = 0.88). However, we observed an 80% increase in 4C scores on lab report #2 when we provided sentence-level writing practice compared to feedback alone ( Fig 4B , t test, p < 0.001). The mean 4C scores in each year for each assessment, as well as the forms of writing support employed, are summarized in Table 2 . Overall, these data suggest that sentence-level writing practice with feedback is important in helping students improve the syntax of quantitative writing.

Fig 4. (A) Mean 4C scores of QC statements from the first lab reports (enzyme kinetics). (B) Mean 4C scores of QC statements from the second lab reports (transcriptional regulation). (C) Percent difference between the two lab reports within a given year, broken down by essential components. (*p < 0.05, ***p < 0.001.) Error bars in A and B represent SEM.

https://doi.org/10.1371/journal.pone.0203109.g004

Table 2: https://doi.org/10.1371/journal.pone.0203109.t002

We were surprised to find that although the trends in the data were similar between the two lab reports, the mean 4C scores of QC statements in lab report #2 were 40% lower than in lab report #1 in both 2014 and 2015 (t test, p < 0.0001, Fig 4A versus 4B ). We predicted that writing skills would either improve with focused practice, or not change over the course of the quarter. To understand which components of the quantitative comparative statement were differentially impacted in the two lab reports, we calculated the relative frequency with which each component was included in a QC statement. Then, we calculated the difference of those frequencies between the first and second lab report for each year ( Fig 4C ). A column below the x-axis indicates that students made particular mistakes more often in lab report #2 ( Fig 4C ). In 2014, students were able to make comparisons equally well between both lab reports, but students struggled to include a quantitative difference or provide context in their evidence statements ( Fig 4C ). In 2015, in addition to general writing feedback, we also provided instructional support to calculate relative differences. We noted that students were able to incorporate both comparisons and calculations into their QC statements in both reports. However, they often omitted the context ( Fig 4C ). The frequency of mistakes made by students is significantly different between lab report #1 and lab report #2 (Chi squared, p < 0.001). These data suggest that feedback alone is not sufficient to improve quantitative writing. In 2016, we provided targeted practice at the sentence level and observed no significant difference in mean 4C scores between the two lab reports ( Fig 4B , t test, p = 0.0596), suggesting that the writing skills of students did not decrease from one lab report to the next. Additionally, students included the four elements of the QC statement equally well between the two lab reports (Chi squared, p = 0.6530, Fig 4C , 2016). Thus, when students receive targeted, sentence-level writing practice, their ability to write QC statements improves.

Quantitative writing quality is negatively impacted by complexity

We were perplexed as to why quantitative writing syntax (as measured by mean 4C scores) declined in lab report #2 compared to lab report #1 in both 2014 and 2015 ( Fig 4A and 4B ). Because we view the essential components of QC statements as analogous to syntactic rules that govern writing of QC statements, we can apply principles and theories that govern writing skills writ large. Research from writing in English Composition shows that writing ability, as measured by sentence level syntax, deteriorates when the writer is struggling with basic comprehension [ 17 , 18 ]. We hypothesized that students’ ability to write about data also might be negatively impacted when students struggled to comprehend the conceptual system they were asked to interrogate. However, we found no correlation between mean 4C scores and any assessment of conceptual material (data not shown). Nor was there an association between mean 4C scores on the lab reports and the related sections of the final (data not shown). Together, these data suggest that conceptual comprehension does not impact writing of a QC statement.

In addition to conceptual understanding, QC statements require that the writer parse through the data set to select the relevant data points to interrogate. We hypothesized that the number of data points (values) in the data set may negatively impact QC statement syntax. We calculated the complexity of different assignments (see Methods) and plotted mean 4C scores as a function of complexity index. We performed linear regression analysis on the mean 4C scores from writing samples occurring prior to formal writing intervention (2014 and 2015 lab reports, and the 2016 pre-test; Fig 5A, closed circles) and those occurring after specific writing intervention (2016 lab reports and 2016 post-test; Fig 5A, open circles). There is a strong inverse correlation between writing, as measured by mean 4C scores, and complexity (r² = 0.9471 for supported and r² = 0.9644 for unsupported writing, Fig 5A). Moreover, the slopes of the lines generated from the regression analysis of mean 4C scores do not vary significantly despite writing interventions (p = 0.3449). Although the task complexity in 2016 was reduced relative to 2015, the negative impact of complexity on writing persisted. Thus, as the complexity of experimental data sets increases, the ability to write clearly decreases regardless of the writing intervention.
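A regression of mean 4C score on complexity index like the one described here can be run with standard tools. The sketch below shows the general approach only; the complexity values and mean scores are invented placeholders, not the study's data.

```python
from scipy import stats

# Placeholder values standing in for mean 4C scores at several complexity
# indices; they are NOT the study's data, only a demonstration of the analysis.
complexity = [4, 6, 8, 12]
mean_4c_scores = [3.1, 2.7, 2.2, 1.5]

result = stats.linregress(complexity, mean_4c_scores)
print(f"slope = {result.slope:.3f}")           # negative slope: quality falls as complexity rises
print(f"r_squared = {result.rvalue ** 2:.3f}")
```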

Fig 5. (A) Writing syntax as a function of complexity, measured by 4C scoring and reported as either unsupported (closed circles) or supported (open circles) by instructional intervention. Linear regression lines are shown (unsupported R² = 0.9644, supported R² = 0.9471). (B) Students were stratified based on overall performance in the course; statements from students within each group were averaged and reported. Error bars represent SEM.

https://doi.org/10.1371/journal.pone.0203109.g005

Complexity differentially impacts specific populations of students

Part of the developmental process of analytical reasoning is parsing relevant from irrelevant data [ 1 ]. We asked if subpopulations of our students were more capable of parsing information from larger data sets than others. We stratified 2016 students into quartiles based on overall performance in the course. We measured the mean 4C scores from the post-test and both lab reports, and plotted mean 4C score as a function of “constrained” complexity ( Fig 5B ). At lower complexity levels, there is no significant difference between the highest performing students and the lowest performing students (t test, p >0.05). Increasing complexity also had a negative impact on most of our students. However, students in the top quartile were less affected by increased complexity than the lower 75% of the class (t test, p <0.05, Fig 5B ). These data suggest there are students who are developmentally capable of controlling the complexity of the task to focus on the skill of writing.

We set out to help STEM students write more clearly and we focused on writing a specific but universal form of evidence statement, the quantitative comparative statement ([ 14 , 15 ], Fig 2 ). By analyzing text from student lab reports and professional scientific articles, we defined the syntax of quantitative comparative statements ( Fig 1 , Table 1 ). Based on the syntactic rules we established, we scored individual quantitative comparative statements and measured writing quality (Figs 3 – 5 ). Our data show that writing quality (measured by 4C scoring) can be improved with focused practice and feedback (Figs 3 and 4 ). Finally, our data show that the circumstance, i.e., the complexity of the writing task, influences writing quality. For example, writing quality decreased when students interrogated larger data sets (Figs 4 and 5 ), but was improved when students were directed by the writing prompt to focus on a subset of the data ( Fig 5 and data not shown).

Our findings are consistent with previous research in Writing Studies and English Composition showing that syntax suffers when writers are confronted with complex and unfamiliar conceptual material [17, 18, 19]. The Cognitive Process Theory of Writing states that writing is a cognitive endeavor and that three main cognitive activities impact writing: the process of writing (syntax, grammar, spelling, organization, etc.), the task environment (the purpose of the writing task), and knowledge of the writing topic [17, 18, 19]. The theory posits that cognitive overload in any of these areas will negatively impact writing quality [17, 18]. Consistent with the theory, our data show that writing quality is a function of explicit writing practice (Fig 3), the size of the data set (Fig 4A compared to 4B), and the scope of the writing prompts (Fig 4B, 2015 compared to 2016).

Explicit sentence level practice improves writing quality

Our data suggest that practicing isolated sentence construction improves writing quality (Figs 3 and 4). In every year of this study, we provided students with generalized feedback about their quantitative comparative statements (e.g., "needs quantitation" or "needs a comparison") within the context of their lab report. In 2016, students practiced writing a QC statement related to their data but separate from the lab report. Although our feedback was the same, we observed improvement only when the feedback was given on QC statements practiced outside the lab report context (Fig 4A compared to 4B). Consistent with our data, the Cognitive Process Theory of Writing predicts that practicing specific syntax will increase fluency, lower the cognitive load on the writer's working memory, and improve writing [17, 18]. Our data are also consistent with research in English Composition demonstrating that when instructors support sentence-level syntax, they observe improved sentence-level construction, improved overall composition, and higher-level critical thinking [20]. In addition to improved sentence-level syntax, we also observed that the overall quality of lab reports improved 12% in 2016 compared to the same lab report in 2015 (based on rubric scores, data not shown). If students develop a greater facility with the process of writing by practicing sentence-level syntax, they have more cognitive resources available to develop and communicate their reasoning (our data, [20, 21]).

Complexity of the writing task affects writing quality

We defined the complexity of the writing assignment as the landscape of information students must sample to interpret and communicate their data. In the case of lab reports, that information is the collected and analyzed data set ( Table 2 ). Students interrogating a larger data set produced lower quality QC statements than when they interrogated a smaller data set (compare lab report #2 to lab report #1 in both 2014 and 2015 cohorts, Fig 4 ). In lab report #2, students not only contended with a larger number of values in the dataset compared to lab report #1, but also with two different measurements. These data are consistent with the Cognitive Process Theory of Writing that suggests that when demands on the writer’s knowledge of the topic increase, the writer cannot devote as many cognitive resources to the task environment or process of writing [ 17 , 18 ]. However, we observed that the negative effect of experimental complexity on writing quality can be mitigated by writing prompts that focus students on a smaller, specific subset of the data ( Fig 5A ). More focused writing prompts and smaller data sets reduce the task environment of the assignment and allow more cognitive load to be devoted to the process of writing.

Model for writing quality as a function of complexity

Interestingly, the writing quality of students who finished the course with higher final grades (top quartile) was more resistant to increases in complexity compared to their classmates (Fig 5B). These data are consistent with the ideas of McCutchen, who posits that as writers become more expert in their field, they have more cognitive resources to devote to clear communication. McCutchen suggests that expert writers have 1) more knowledge of their discipline, 2) more familiarity with the genres of science writing (task environment), and 3) more practice with the process of writing [19]. Based on research in Writing Studies, the Cognitive Process Theory of Writing, and the data presented here, we developed a predictive model of the impact of complexity (cognitive load) on writing quality (Fig 6). We have hypothesized a linear model in which any increase in complexity negatively impacts writing quality (Fig 6A) and a "breakpoint" model in which writers maintain a constant level of writing quality at lower complexity levels but decline at higher levels of complexity (Fig 6B). We hypothesize that our top-performing students have moved into a more expert space in the model by developing strategies to parse a complex task environment and ignore irrelevant information. Effectively, these skills allow them to minimize the impact of complexity on their cognitive load and maintain their writing quality even in the face of complex data sets (Fig 5B).

Fig 6. (A) Simple linear model of the relationship between writing quality and complexity (cognitive load). (B) Model of the relationship between writing quality and complexity in which low complexity has minimal impact on writing quality but higher complexity negatively impacts writing quality.

https://doi.org/10.1371/journal.pone.0203109.g006

4C instruction as a writing intervention

In addition to altering the writing assignment to decrease cognitive load on the students, we also think it will be important to provide students with syntactic structures at the sentence level. In this study, we did not use 4C annotation as an instructional intervention so that 4C scoring would be a more objective measure of writing quality. But, subsequent to this study, we and others have used 4C annotation as an instructional tool and found that student writing improves dramatically (data not shown). Although some argue that using overly structured or templated sentences can stifle creativity, providing basic structure does not necessarily lead to pedantic writing [22]. A commonly used text in college writing, "They Say, I Say," argues that providing templates for constructing opinions and arguments gives students a greater ability to express their thoughts [23]. Specifically, weaker writers who lack an intuitive understanding of how to employ these writing structures benefit from the use of explicit templates, while more advanced writers already employ these structures in a fluid and nuanced manner [23].

4C template as a foundation of quantitative writing

As students become more expert writers and write more complex and sophisticated sentences, they may choose to deviate from the prescribed sentence structure and make editorial decisions about the elements of the quantitative comparison in the context of their argument [23]. In fact, when we examined the 4C scores of quantitative comparative statements in published literature, we found that, on average, professional scientists write comparisons that are missing one of the three elements (4C score = 1.89 ± 0.05, n = 281). The expert writer may eliminate an element of the evidence statement because he or she presumes a more sophisticated audience is capable of inferring the missing element from prior knowledge or from the context of the argument. Or, the author may provide all elements of the quantitative comparison in their argument but not within a single sentence.

Helping students become expert writers

Based on our research, we think novice writers should write for novice readers and include all of the syntactic elements of a QC statement. As students develop their professional voice, the 4C template will serve as a touchstone to frame their quantitative arguments, and the editorial choices they make will depend on the sophistication of their audience. Students will write clear arguments even if those elements no longer reside within the rigid structure of a single QC statement with a perfect 4C score. We are confident that by supporting student writing at the level of syntax, we are building a solid foundation that will give students greater capacity for reasoning in the face of increasing experimental complexity.

Supporting information

S1 Fig. Pre-test/post-test.

Example of the pre- and post-test used to assess the ability to interpret graphical and tabular data and write a quantitative comparative statement.

https://doi.org/10.1371/journal.pone.0203109.s001

S2 Fig. Lab Report Rubric.

A detailed rubric provides students with explicit guidance for each lab report. This rubric corresponds with the experiment exploring enzyme kinetics of β-galactosidase.

https://doi.org/10.1371/journal.pone.0203109.s002

Acknowledgments

The authors thank Dr. Jessica Santangelo for critical feedback on the manuscript and unwavering support for this project. This study was initially developed as part of the Biology Scholars Program (Research Residency) through the American Society for Microbiology and the National Science Foundation (T.R.)

References

  • 1. American Association for the Advancement of Science. Vision and Change in Undergraduate Biology Education: A Call to Action. Brewer C and Smith D, Eds. American Association for the Advancement of Science; 2011. pp. 1–100. http://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • 3. Bazerman C. Shaping Written Knowledge: The Genre and Activity of the Experimental Article in Science. University of Wisconsin Press; 1988.
  • 15. Miller JE. The Chicago Guide to Writing about Numbers. 2nd ed. Chicago: University of Chicago Press; 2015.
  • 20. Languis ML, Buffer JJ, Martin D, Naour PJ. Cognitive Science: Contributions to Educational Practice. Routledge; 2012. 304 p.
  • 23. Graff G, Birkenstein C. They Say / I Say: The Moves That Matter in Academic Writing. New York: W.W. Norton & Co.; 2010.

How to Write a Quantitative Analysis Report

A quantitative analysis can give people the necessary information to make decisions about policy and planning for a program or organization. A good quantitative analysis leaves no questions about the quality of data and the authority of the conclusions. Whether in school completing a project or at the highest levels of government evaluating programs, knowing how to write a quality quantitative analysis is helpful. A quantitative analysis uses hard data, such as survey results, and generally requires the use of computer spreadsheet applications and statistical know-how.

In the introduction, explain why the report is being written. Point out the need that is being filled and describe any prior research that has been conducted in the same field. The introduction should also note what future research would be needed to thoroughly answer the questions you set out to study. You should also state for whom the report is being prepared.

Describe the methods used to collect the data for the report. If a survey was used, tell the reader how it was designed, and let the reader know if a pilot test of the survey was distributed first. Detail the target population, or the group of people being studied. Provide the sample size, or the number of people surveyed. Tell the reader whether the sample was representative of the target population, and explain whether you collected enough surveys. Break down the data by gender, race, age and any other pertinent subcategory. Tell the reader about any problems with data collection, including any biases in the survey, missing results or odd responses from people surveyed.
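If the survey responses live in a spreadsheet or CSV file, the breakdowns described above can be produced with a few lines of analysis code. The sketch below uses pandas; the file name and column names are invented placeholders, so substitute the ones from your own survey.

```python
import pandas as pd

# Hypothetical file and column names; replace them with your own.
responses = pd.read_csv("survey_responses.csv")

print(f"Sample size: {len(responses)}")

# Break the sample down by pertinent subcategories (here, gender and age group).
print(responses["gender"].value_counts(dropna=False))
print(responses.groupby(["gender", "age_group"]).size())

# Count missing answers per question so data-collection problems can be reported.
print(responses.isna().sum())
```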

Create graphs showing visual representations of the results. You can use bar graphs, line graphs or pie charts, depending on the data you need to convey. Only write about the pertinent findings, or the ones you think matter most, in the body of the report. Any other results can be attached in the appendices at the end of the report. The raw data, along with a copy of the blank survey, should be in the appendices as well. The reader can refer to all the data to inform his or her own opinions about the findings.
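If you prefer to generate the graphs programmatically, a plotting library can produce a basic chart of the results. The sketch below uses matplotlib with invented response counts purely as an illustration of a bar graph you might include in the body or appendices.

```python
import matplotlib.pyplot as plt

# Invented survey results used only to illustrate a simple bar chart.
categories = ["Strongly agree", "Agree", "Neutral", "Disagree", "Strongly disagree"]
responses = [42, 65, 20, 12, 6]

fig, ax = plt.subplots(figsize=(7, 4))
ax.bar(categories, responses, color="steelblue")
ax.set_ylabel("Number of respondents")
ax.set_title("Sample survey item: response distribution")
plt.tight_layout()
plt.savefig("survey_item_responses.png", dpi=150)  # embed or attach this image in the report
```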

Write conclusions after evaluating all the data. The conclusion can include an action item for the reader to accomplish. It can also advise that more research needs to be done before any solid conclusions can be made. Only conclusions that can be made based on the findings should be included in the report.

Write an executive summary to attach at the beginning of the report. Executive summaries are quick one- to two-page recaps of what is in the report. They include shorter versions of the introduction, methods, findings and conclusions, and they allow readers to quickly understand what is said in the report.

References

  • Syracuse University: Practicum in Public Policy
  • Georgia Tech: Questionnaire Design



REPORT WRITING OF QUALITATIVE AND QUANTITATIVE RESEARCH

Majeed Ali

Quantitative report writing




Qualitative Research Resources: Writing Up Your Research

Created by health science librarians.


About this Page

  • Writing conventions for qualitative research
  • Sample size/sampling


Why is this information important?

  • The conventions of good writing and research reporting are different for qualitative and quantitative research.
  • Your article will be more likely to be published if you make sure you follow appropriate conventions in your writing.

On this page you will find the following helpful resources:

  • Articles with information on what journal editors look for in qualitative research articles.
  • Articles and books on the craft of collating qualitative data into a research article.

These articles provide tips on what journal editors look for when they read qualitative research papers for potential publication. Also see the Assessing Qualitative Research tab in this guide for additional information that may be helpful to authors.

Belgrave, L., D. Zablotsky and M.A. Guadagno.(2002). How do we talk to each other? Writing qualitative research for quantitative readers . Qualitative Health Research , 12(10),1427-1439.

Hunt, Brandon. (2011) Publishing Qualitative Research in Counseling Journals . Journal of Counseling and Development 89(3):296-300.

Fetters, Michael and Dawn Freshwater. (2015). Publishing a Methodological Mixed Methods Research Article. Journal of Mixed Methods Research 9(3): 203-213.

Koch, Lynn C., Tricia Niesz, and Henry McCarthy. (2014). Understanding and Reporting Qualitative Research: An Analytic Review and Recommendations for Submitting Authors. Rehabilitation Counseling Bulletin 57(3):131-143.

Morrow, Susan L. (2005) Quality and Trustworthiness in Qualitative Research in Counseling Psychology ; Journal of Counseling Psychology 52(2):250-260.

Oliver, Deborah P. (2011) "Rigor in Qualitative Research." Research on Aging 33(4): 359-360.

Sandelowski, M., & Leeman, J. (2012). Writing usable qualitative health research findings . Qual Health Res, 22(10), 1404-1413.

Schoenberg, Nancy E., Miller, Edward A., and Pruchno, Rachel. (2011) The qualitative portfolio at The Gerontologist : strong and getting stronger. Gerontologist 51(3): 281-284.

Weaver-Hightower, M. B. (2019). How to write qualitative research . [e-book]

Sidhu, Kalwant, Roger Jones, and Fiona Stevenson (2017). Publishing qualitative research in medical journals. Br J Gen Pract ; 67 (658): 229-230. DOI: 10.3399/bjgp17X690821 PMID: 28450340

  • This article is based on a workshop on publishing qualitative studies held at the Society for Academic Primary Care Annual Conference, Dublin, July 2016.

Smith, Mary Lee.(1987) Publishing Qualitative Research. American Educational Research Journal 24(2): 173-183.

Tong, Allison, Sainsbury, Peter, Craig, Jonathan ; Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups , International Journal for Quality in Health Care , Volume 19, Issue 6, 1 December 2007, Pages 349–357, https://doi.org/10.1093/intqhc/mzm042 .

Tracy, Sarah. (2010) Qualitative Quality: Eight 'Big-Tent' Criteria for Excellent Qualitative Research. Qualitative Inquiry 16(10):837-51.

Because reviewers are not always familiar with qualitative methods, they may ask for explanation or justification of your methods when you submit an article. Because different disciplines, different qualitative methods, and different contexts may dictate different approaches to this issue, you may want to consult articles in your field and in target journals for publication. Additionally, here are some articles that may be helpful in thinking about this issue.

Bonde, Donna. (2013). Qualitative Interviews: When Enough is Enough . Research by Design.

Guest, Greg, Arwen Bunce, and Laura Johnson. (2006) How Many Interviews are Enough?: An Experiment with Data Saturation and Variability. Field Methods 18(1): 59-82.

Morse, Janice M. (2015) "Data Were Saturated..." Qualitative Health Research 25(5): 587-88 . doi:10.1177/1049732315576699.

Nelson, J. (2016) "Using Conceptual Depth Criteria: Addressing the Challenge of Reaching Saturation in Qualitative Research." Qualitative Research, December. doi:10.1177/1468794116679873.

Patton, Michael Quinn. (2015) "Chapter 5: Designing Qualitative Studies, Module 30 Purposeful Sampling and Case Selection. In Qualitative Research & Evaluation Methods: Integrating Theory and Practice, Fourth edition, pp. 264-72. Thousand Oaks, California: SAGE Publications, Inc. ISBN: 978-1-4129-7212-3

Small, Mario Luis. (2009) 'How Many Cases Do I Need?': On Science and the Logic of Case-Based Selection in Field-Based Research. Ethnography 10(1): 538.

Search the UNC-CH catalog for books about qualitative writing . Selected general books from the catalog are listed below. If you are a researcher at another institution, ask your librarian for assistance locating similar books in your institution's catalog or ordering them via InterLibrary Loan.  


Oft quoted and food for thought

  • Morse, J. M. (1997). " Perfectly healthy, but dead": the myth of inter-rater reliability. DOI:10.1177/104973239700700401 Editorial
  • Silberzahn, R., Uhlmann, E. L., Martin, D. P., Anselmi, P., Aust, F., Awtrey, E., ... & Carlsson, R. (2018). Many analysts, one data set: Making transparent how variations in analytic choices affect results. Advances in Methods and Practices in Psychological Science.
  • Last Updated: Oct 17, 2023 1:03 PM
  • URL: https://guides.lib.unc.edu/qual



Commentary: Writing and Evaluating Qualitative Research Reports

Yelena P. Wu (Division of Public Health, Department of Family and Preventive Medicine, University of Utah; Cancer Control and Population Sciences, Huntsman Cancer Institute), Deborah Thompson (Department of Pediatrics-Nutrition, USDA/ARS Children’s Nutrition Research Center, Baylor College of Medicine), Karen J. Aroian (College of Nursing, University of Central Florida), Elizabeth L. McQuaid (Department of Psychiatry and Human Behavior, Brown University), and Janet A. Deatrick (School of Nursing, University of Pennsylvania)

Journal of Pediatric Psychology

Objective: To provide an overview of qualitative methods, particularly for reviewers and authors who may be less familiar with qualitative research. Methods: A question and answer format is used to address considerations for writing and evaluating qualitative research. Results and Conclusions: When producing qualitative research, individuals are encouraged to address the qualitative research considerations raised and to explicitly identify the systematic strategies used to ensure rigor in study design and methods, analysis, and presentation of findings. Increasing capacity for review and publication of qualitative research within pediatric psychology will advance the field’s ability to gain a better understanding of the specific needs of pediatric populations, tailor interventions more effectively, and promote optimal health.

The Journal of Pediatric Psychology (JPP) has a long history of emphasizing high-quality, methodologically rigorous research in social and behavioral aspects of children’s health ( Palermo, 2013 , 2014 ). Traditionally, research published in JPP has focused on quantitative methodologies. Qualitative approaches are of interest to pediatric psychologists given the important role of qualitative research in developing new theories ( Kelly & Ganong, 2011 ), illustrating important clinical themes ( Kars, Grypdonck, de Bock, & van Delden, 2015 ), developing new instruments ( Thompson, Bhatt, & Watson, 2013 ), understanding patients’ and families’ perspectives and needs ( Bevans, Gardner, Pajer, Riley, & Forrest, 2013 ; Lyons, Goodwin, McCreanor, & Griffin, 2015 ), and documenting new or rarely examined issues ( Haukeland, Fjermestad, Mossige, & Vatne, 2015 ; Valenzuela et al., 2011 ). Further, these methods are integral to intervention development ( Minges et al., 2015 ; Thompson et al., 2007 ) and understanding intervention outcomes ( de Visser et al., 2015 ; Hess & Straub, 2011 ). For example, when designing an intervention, qualitative research can identify patient and family preferences for and perspectives on desirable intervention characteristics and perceived needs ( Cassidy et al., 2013 ; Hess & Straub, 2011 ; Thompson, 2014 ), which may lead to a more targeted, effective intervention.

Both qualitative and quantitative approaches are concerned with issues such as generalizability of study findings (e.g., to whom the study findings can be applied) and rigor. However, qualitative and quantitative methods approach these issues differently. The purpose of qualitative research is to contribute knowledge or understanding by describing phenomena within certain groups or populations of interest, not to produce generalizable findings. Instead, qualitative research has a discovery focus and often uses an iterative approach. Thus, qualitative work is often foundational to future qualitative, quantitative, or mixed-methods studies.

At the time of this writing, three of six current calls for papers for special issues of JPP specifically note that manuscripts incorporating qualitative approaches would be welcomed. Despite apparent openness to broadening JPP’s emphasis beyond its traditional quantitative approach, few published articles have used qualitative methods. For example, of 232 research articles published in JPP from 2012 to 2014 (excluding commentaries and reviews), only five used qualitative methods (2% of articles).

The goal of the current article is to present considerations for writing and evaluating qualitative research within the context of pediatric psychology to provide a framework for writing and reviewing manuscripts reporting qualitative findings. The current article may be especially useful to reviewers and authors who are less familiar with qualitative methods. The tenets presented here are grounded in the well-established literature on reporting and evaluating qualitative research, including guidelines and checklists ( Eakin & Mykhalovskiy, 2003 ; Elo et al., 2014 ; Mays & Pope, 2000 ; Tong, Sainsbury, & Craig, 2007 ). For example, the Consolidated Criteria for Reporting Qualitative Research checklist describes essential elements for reporting qualitative findings ( Tong et al., 2007 ). Although the considerations presented in the current manuscript have broad applicability to many fields, examples were purposively selected for the field of pediatric psychology.

Our goal is that this article will stimulate publication of more qualitative research in pediatric psychology and allied fields. More specifically, the goal is to encourage high-quality qualitative research by addressing key issues involved in conducting qualitative studies, and the process of conducting, reporting, and evaluating qualitative findings. Readers interested in more in-depth information on designing and implementing qualitative studies, relevant theoretical frameworks and approaches, and analytic approaches are referred to the well-developed literature in this area ( Clark, 2003 ; Corbin & Strauss, 2008 ; Creswell, 1994 ; Eakin & Mykhalovskiy, 2003 ; Elo et al., 2014 ; Mays & Pope, 2000 ; Miles, Huberman, & Saldaña, 2013 ; Ritchie & Lewis, 2003 ; Saldaña, 2012 ; Sandelowski, 1995 , 2010 ; Tong et al., 2007 ; Yin, 2015 ). Researchers new to qualitative research are also encouraged to obtain specialized training in qualitative methods and/or to collaborate with a qualitative expert in an effort to ensure rigor (i.e., validity).

We begin the article with a definition of qualitative research and an overview of the concept of rigor. While we recognize that qualitative methods comprise multiple and distinct approaches with unique purposes, we present an overview of considerations for writing and evaluating qualitative research that cut across qualitative methods. Specifically, we present basic principles in three broad areas: (1) study design and methods, (2) analytic considerations, and (3) presentation of findings (see Table 1 for a summary of the principles addressed in each area). Each area is addressed using a “question and answer” format. We present a brief explanation of each question, options for how one could address the issue raised, and a suggested recommendation. We recognize, however, that there are no absolute “right” or “wrong” answers and that the most “right” answer for each situation depends on the specific study and its purpose. In fact, our strongest recommendation is that authors of qualitative research manuscripts be explicit about their rationale for design, analytic choices, and strategies so that readers and reviewers can evaluate the rationale and rigor of the study methods.

Table 1. Summary of Overarching Principles to Address in Qualitative Research Manuscripts

What Is Qualitative Research?

Qualitative methods are used across many areas of health research, including health psychology ( Gough & Deatrick, 2015 ), to study the meaning of people’s lives in their real-world roles, represent their views and perspectives, identify important contextual conditions, discover new or additional insights about existing social and behavioral concepts, and acknowledge the contribution of multiple perspectives ( Yin, 2015 ). Qualitative research is a family of approaches rather than a single approach. There are multiple and distinct qualitative methodologies or stances (e.g., constructivism, post-positivism, critical theory), each with different underlying ontological and epistemological assumptions ( Lincoln, Lynham, & Guba, 2011 ). However, certain features are common to most qualitative approaches and distinguish qualitative research from quantitative research ( Creswell, 1994 ).

Key to all qualitative methodologies is that multiple perspectives about a phenomenon of interest are essential, and that those perspectives are best inductively derived or discovered from people with personal experience regarding that phenomenon. These perspectives or definitions may differ from “conventional wisdom.” Thus, meanings need to be discovered from the population under study to ensure optimal understanding. For instance, in a recent qualitative study about texting while driving, adolescents said that they did not approve of texting while driving. The investigators, however, discovered that the respondents did not consider themselves driving while a vehicle was stopped at a red light. In other words, the respondents did approve of texting while stopped at a red light. In addition, the adolescents said that they highly valued being constantly connected via texting. Thus, what is meant by “driving” and the value of “being connected” need to be considered when approaching the issue of texting while driving with adolescents ( McDonald & Sommers, 2015 ).

Qualitative methods are also distinct from a mixed-method approach (i.e., integration of qualitative and quantitative approaches; Creswell, 2013b ). A mixed-methods study may include a first phase of quantitative data collection that provides results that inform a second phase of the study that includes qualitative data collection, or vice versa. A mixed-methods study may also include concurrent quantitative and qualitative data collection. The timing, priority, and stage of integration of the two approaches (quantitative and qualitative) are complex and vary depending on the research question; they also dictate how to attend to differing qualitative and quantitative principles ( Creswell et al., 2011 ). Understanding the basic tenets of qualitative research is preliminary to integrating qualitative research with another approach that has different tenets. A full discussion of the integration of qualitative and quantitative research approaches is beyond the scope of this article. Readers interested in the topic are referred to one of the many excellent resources on the topic ( Creswell, 2013b ).

What Are Typical Qualitative Research Questions?

Qualitative research questions are typically open-ended and are framed in the spirit of discovery and exploration and to address existing knowledge gaps. The current manuscript provides exemplar pediatric qualitative studies that illustrate key issues that arise when reporting and evaluating qualitative studies. Example research questions that are contained in the studies cited in the current manuscript are presented in Table 2 .

Table 2. Example Qualitative Research Questions From the Pediatric Literature

What Are Rigor and Transparency in Qualitative Research?

There are several overarching principles with unique application in qualitative research, including definitions of scientific rigor and the importance of transparency. Quantitative research generally uses the terms reliability and validity to describe the rigor of research, while in qualitative research, rigor refers to the goal of seeking to understand the tacit knowledge of participants’ conception of reality ( Polanyi, 1958 ). For example, Haukeland and colleagues (2015) used qualitative analysis to identify themes describing the emotional experiences of a unique and understudied population—pediatric siblings of children with rare medical conditions such as Turner syndrome and Duchenne muscular dystrophy. Within this context, the authors’ rendering of the diverse and contradictory emotions experienced by siblings of children with these rare conditions represents “rigor” within a qualitative framework.

While debate exists regarding the terminology describing and strategies for strengthening scientific rigor in qualitative studies ( Guba, 1981 ; Morse, 2015a , 2015b ; Sandelowski, 1993a ; Whittemore, Chase, & Mandle, 2001 ), little debate exists regarding the importance of explaining strategies used to strengthen rigor. Such strategies should be appropriate for the specific study; therefore, it is wise to clearly describe what is relevant for each study. For example, in terms of strengthening credibility or the plausibility of data analysis and interpretation, prolonged engagement with participants is appropriate when conducting an observational study (e.g., observations of parent–child mealtime interactions; Hughes et al., 2011 ; Power et al., 2015 ). For an interview-only study, however, it would be more practical to strengthen credibility through other strategies (e.g., keeping detailed field notes about the interviews included in the analysis).

Dependability is the stability of a data analysis protocol. For instance, stepwise development of a coding system from an “a priori” list of codes based on the underlying conceptual framework or existing literature (e.g., creating initial codes for potential barriers to medication adherence based on prior studies) may be essential for analysis of data from semi-structured interviews using multiple coders. But this may not be the ideal strategy if the purpose is to inductively derive all possible coding categories directly from data in an area where little is known.

For some research questions, the strategy may be to strengthen confirmability, or to verify a specific phenomenon of interest using different sources of data before generating conclusions. This process, commonly referred to in the research literature as triangulation, may also include collecting different types of data (e.g., interview data, observational data), using multiple coders to incorporate different ways of interpreting the data, or using multiple theories (Krefting, 1991; Ritchie & Lewis, 2003). Alternatively, another investigator may use triangulation to provide complementary data (Krefting, 1991) to garner additional information and deepen understanding. Because the purpose of qualitative research is to discover multiple perspectives about a phenomenon, it is not necessarily appropriate to attain concordance across studies or investigators when independently analyzing data. Some qualitative experts also believe that it is inappropriate to use triangulation to confirm findings, but this debate has not been resolved within the field (Ritchie & Lewis, 2003; Tobin & Begley, 2004). More agreement exists, however, regarding the value of triangulation to complement, deepen, or expand understanding of a particular topic or issue (Ritchie & Lewis, 2003).

Finally, instead of basing a study on a sample that allows for generalizing statistical results to other populations, investigators in qualitative research studies are focused on designing a study and conveying the results so that the reader understands the transferability of the results. Strategies for transferability may include explanations of how the sample was selected and descriptive characteristics of study participants, which provide a context for the results and enable readers to decide if other samples share critical attributes. A study is deemed transferable if relevant contextual features are common to both the study sample and the larger population.

Strategies to enhance rigor should be used systematically across each phase of a study. That is, rigor needs to be identified, managed, and documented throughout the research process: during the preparation phase (data collection and sampling), organization phase (analysis and interpretation), and reporting phase (manuscript or final report; Elo et al., 2014 ). From this perspective, the strategies help strengthen the trustworthiness of the overall study (i.e., to what extent the study findings are worth heeding; Eakin & Mykhalovskiy, 2003 ; Lincoln & Guba, 1985 ).

A good example of managing and documenting rigor and trustworthiness can be found in a study of family treatment decisions for children with cancer (Kelly & Ganong, 2011). The researchers describe how they promoted the rigor of the study and strengthened its credibility by triangulating data sources (e.g., obtaining data from children’s custodial parents, stepparents, etc.), debriefing (e.g., holding detailed conversations with colleagues about the data and interpretations of the data), member checking (i.e., presenting preliminary findings to participants to obtain their feedback and interpretation), and reviewing study procedure decisions and analytic procedures with a second party.

Transparency is another key concept in written reports of qualitative research. In other words, enough detail should be provided for the reader to understand what was done and why (Ritchie & Lewis, 2003). Examples of information that should be included are a clear rationale for selecting a particular population or people with certain characteristics, the research question being investigated, and a meaningful explanation of why this research question was selected (i.e., the gap in knowledge or understanding that is being investigated; Ritchie & Lewis, 2003). Clearly describing recruitment, enrollment, data collection, and data analysis or extraction methods is equally important (Dixon-Woods, Shaw, Agarwal, & Smith, 2004). Coherence among methods and transparency about research decisions add to the robustness of qualitative research (Tobin & Begley, 2004) and provide a context for understanding the findings and their implications.

Study Design and Methods

Is Qualitative Research Hypothesis Driven?

In contrast to quantitative research, qualitative research is not typically hypothesis driven (Creswell, 1994; Ritchie & Lewis, 2003). A risk associated with using hypotheses in qualitative research is that the findings could be biased by the hypotheses. Instead, qualitative research is exploratory and typically guided by a research question or conceptual framework rather than hypotheses (Creswell, 1994; Ritchie & Lewis, 2003). As previously stated, the goal of qualitative research is to increase understanding in areas where little is known by developing deeper insight into complex situations or processes. According to Richards and Morse (2013), “If you know what you are likely to find, …  you should not be working qualitatively” (p. 28). Thus, we do not recommend that a hypothesis be stated in manuscripts presenting qualitative data.

What Is the Role of Theory in Qualitative Research?

Consistent with the exploratory nature of qualitative research, one particular qualitative method, grounded theory, is used specifically for discovering substantive theory (i.e., working theories of action or processes developed for a specific area of concern; Bryant & Charmaz, 2010 ; Glaser & Strauss, 1967 ). This method uses a series of structured steps to break down qualitative data into codes, organize the codes into conceptual categories, and link the categories into a theory that explains the phenomenon under study. For example, Kelly and Ganong (2011) used grounded theory methods to produce a substantive theory about how single and re-partnered parents (e.g., households with a step-parent) made treatment decisions for children with childhood cancer. The theory of decision making developed in this study included “moving to place,” which described the ways in which parents from different family structures (e.g., single and re-partnered parents) were involved in the child’s treatment decision-making. The resulting theory also delineated the causal conditions, context, and intervening factors that contributed to the strategies used for moving to place.

Theories may be used in other types of qualitative research as well, serving as the impetus or organizing framework for the study ( Sandelowski, 1993b ). For example, Izaguirre and Keefer (2014) used Social Cognitive Theory ( Bandura, 1986 ) to investigate self-efficacy among adolescents with inflammatory bowel disease. The impetus for selecting the theory was to inform the development of a self-efficacy measure for adolescent self-management. In another study on health care transition in youth with Type 1 Diabetes ( Pierce, Wysocki, & Aroian, 2016 ), the investigators adapted a social-ecological model—the Socio-ecological Model of Adolescent and Young Adult Transition Readiness (SMART) model ( Schwartz, Tuchman, Hobbie, & Ginsberg, 2011 )—to their study population ( Pierce & Wysocki, 2015 ). Pierce et al. (2016) are currently using the adapted SMART model to focus their data collection and structure the preliminary analysis of their data about diabetes health care transition.

Regardless of whether theory is induced from data or selected in advance to guide the study, consistent with the principle of transparency , its role should be clearly identified and justified in the research publication ( Bradbury-Jones, Taylor, & Herber, 2014 ; Kelly, 2010 ). Methodological congruence is an important guiding principle in this regard ( Richards & Morse, 2013 ). If a theory frames the study at the outset, it should guide and direct all phases. The resulting publication(s) should relate the phenomenon of interest and the research question(s) to the theory and specify how the theory guided data collection and analysis. The publication(s) should also discuss how the theory fits with the finished product. For instance, authors should describe how the theory provided a framework for the presentation of the findings and discuss the findings in context with the relevant theoretical literature.

A study examining parents’ motivations to promote vegetable consumption in their children ( Hingle et al., 2012 ) provides an example of methodological congruence. The investigators adapted the Model of Goal Directed Behavior ( Bagozzi & Pieters, 1998 ) for parenting practices relevant to vegetable consumption (Model of Goal Directed Vegetable Parenting Practices; MGDVPP). Consistent with the adapted theoretical model and in keeping with the congruence principle, interviews were guided by the theoretical constructs contained within the MGDVPP, including parents’ attitudes, subjective norms, and perceived behavioral control related to promoting vegetable consumption in children ( Hingle et al., 2012 ). The study discovered that the adapted model successfully identified parents’ motivations to encourage their children to eat more vegetables.

The use of the theory should be consistent with the basic goal of qualitative research, which is discovery. Alternatively stated, theories should be used as broad orienting frameworks for exploring topical areas without imposing preconceived ideas and biases. The theory should be consistent with the study findings and not be used to force-fit the researcher’s interpretation of the data ( Sandelowski, 1993b ). Divergence from the theory when it does not fit the study findings is illustrated in a qualitative study of hypertension prevention beliefs in Hispanics ( Aroian, Peters, Rudner, & Waser, 2012 ). This study used the Theory of Planned Behavior as a guiding theoretical framework but found that coding separately for normative and control beliefs was not the best organizing schema for presenting the study findings. When divergence from the original theory occurs, the research report should explain and justify how and why the theory was modified ( Bradbury-Jones et al., 2014 ).

What Are Typical Sampling Methods in Qualitative Studies?

Qualitative sampling methods should be “purposeful” ( Coyne, 1997 ; Patton, 2015 ; Tuckett, 2004 ). Purposeful sampling is based on the study purpose and investigator judgments about which people and settings will provide the richest information for the research questions. The logic underlying this type of sampling differs from the logic underlying quantitative sampling ( Patton, 2015 ). Quantitative research strives for empirical generalization. In qualitative studies, generalizability beyond the study sample is typically not the intent; rather, the focus is on deriving depth and context-embedded meaning for the relevant study population.

Purposeful sampling is a broad term. Theoretical sampling is one particular type of purposeful sampling unique to grounded theory methods ( Coyne, 1997 ). In theoretical sampling, study participants are chosen according to theoretical categories that emerge from ongoing data collection and analyses ( Bryant & Charmaz, 2010 ). Data collection and analysis are conducted concurrently to allow generating and testing hypotheses that emerge from analyzing incoming data. The following example from the previously mentioned qualitative interview study about transition from pediatric to adult care in adolescents with type 1 diabetes ( Pierce et al., 2016 ) illustrates the process of theoretical sampling: An adolescent study participant stated that he was “turned off” by the “childish” posters in his pediatrician’s office. He elaborated that he welcomed transitioning to adult care because his diabetes was discovered when he was 18, an age when he reportedly felt more “mature” than most pediatric patients. These data were coded as “developmental misfit” and prompted a tentative hypothesis about developmental stage at entry for pediatric diabetes care and readiness for health care transition. Examining this hypothesis prompted seeking study participants who varied according to age or developmental stage at time of diagnosis to examine the theoretical relevance of an emerging theme about developmental fit.

Not all purposeful sampling, however, is “theoretical.” For example, ethnographic studies typically seek to understand a group’s cultural beliefs and practices ( Creswell, 2013a ). Consistent with this purpose, researchers conducting an ethnographic study might purposefully select study participants according to specific characteristics that reflect the social roles and positions in a given group or society (e.g., socioeconomic status, education; Johnson, 1990 ).

Random sampling is generally not used in qualitative research. Random selection requires a sufficiently large sample to maximize the potential for chance and, as will be discussed below, sample size is intentionally small in qualitative studies. However, random sampling may be used to verify or clarify findings ( Patton, 2015 ). Validating study findings with a randomly selected subsample can be used to address the possibility that a researcher is inadvertently giving greater attention to cases that reinforce his or her preconceived ideas.
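To make this concrete, here is a minimal Python sketch of drawing a random verification subsample. The transcript IDs, subsample size, and seed are invented for illustration, and a random re-read is only one of several possible checks on researcher bias.

    # Hypothetical sketch: draw a small random subsample of already-analyzed
    # transcripts for a verification re-read, as one check against a researcher
    # attending mainly to cases that fit his or her preconceived ideas.
    import random

    transcript_ids = [f"P{i:02d}" for i in range(1, 25)]  # 24 invented transcript IDs
    random.seed(7)  # fixed seed so the verification draw can be documented and repeated
    verification_sample = random.sample(transcript_ids, k=5)
    print("Transcripts selected for verification re-review:", verification_sample)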

Regardless of the sampling method used, qualitative researchers should clearly describe the sampling strategy and justify how it fits the study when reporting study findings (transparency). A common error is to refer to theoretical sampling when the cases were not chosen according to emerging theoretical concepts. Another common error is to apply sampling principles from quantitative research (e.g., cluster sampling) to convince skeptical reviewers about the rigor or validity of qualitative research. Rigor is best achieved by being purposeful, making sound decisions, and articulating the rationale for those decisions. As mentioned earlier in the discussion of transferability , qualitative researchers are encouraged to describe their methods of sample selection and descriptive characteristics about their sample so that readers and reviewers can judge how the current sample may differ from others. Understanding the characteristics of each qualitative study sample is essential for the iterative nature of qualitative research whereby qualitative findings inform the development of future qualitative, quantitative, or mixed-methods studies. Reviewers should evaluate sampling decisions based on how they fit the study purpose and how they influence the quality of the end product.

What Sample Size Is Needed for Qualitative Research?

No definitive rules exist about sample size in qualitative research. However, sample sizes are typically smaller than those in quantitative studies ( Patton, 2015 ). Small samples often generate a large volume of data and information-rich cases, ultimately leading to insight regarding the phenomenon under study ( Patton, 2015 ; Ritchie & Lewis, 2003 ). Sample sizes of 20–30 cases are typical, but a qualitative sample can be even smaller under some circumstances ( Mason, 2010 ).

Sample size adequacy is evaluated based on the quality of the study findings, specifically the full development of categories and inter-relationships or the adequacy of information about the phenomenon under study ( Corbin & Strauss, 2008 ; Ritchie & Lewis, 2003 ). Small sample sizes are of concern if they do not result in these outcomes. Data saturation (i.e., the point at which no new information, categories, or themes emerge) is often used to judge informational adequacy ( Morgan, 1998 ; Ritchie & Lewis, 2003 ). Although enough participants should be included to obtain saturation ( Morgan, 1998 ), informational adequacy pertains to more than sample size. It is also a function of the quality of the data, which is influenced by study participant characteristics (e.g., cognitive ability, knowledge, representativeness) and the researcher’s data-gathering skills and analytical ability to generate meaningful findings ( Morse, 2015b ; Patton, 2015 ).
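As a rough illustration of how an analytic team might document its saturation judgment, the Python sketch below counts how many previously unseen codes each successive interview contributes. The interviews and code labels are invented, and a real saturation claim also rests on the depth and quality of the data, not on counts alone.

    # Hypothetical sketch: track whether successive interviews keep yielding new codes.
    # Interview IDs and codes are invented for illustration.
    coded_interviews = [
        {"id": "P01", "codes": {"stigma", "cost", "family support"}},
        {"id": "P02", "codes": {"cost", "transport"}},
        {"id": "P03", "codes": {"stigma", "transport", "self-efficacy"}},
        {"id": "P04", "codes": {"cost", "stigma"}},               # contributes no new codes
        {"id": "P05", "codes": {"family support", "transport"}},  # contributes no new codes
    ]

    codes_seen = set()
    for interview in coded_interviews:
        new_codes = interview["codes"] - codes_seen
        codes_seen |= interview["codes"]
        print(f'{interview["id"]}: {len(new_codes)} new code(s) {sorted(new_codes)}')
    # A run of interviews adding no new codes is one signal, not proof, of saturation.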

Sample size is also influenced by type of qualitative research, the study purpose, the sample, the depth and complexity of the topic investigated, and the method of data collection. In general, the more heterogeneous the sample, the larger the sample size, particularly if the goal is to investigate similarities and differences by specific characteristics ( Ritchie & Lewis, 2003 ). For instance, in a study to conduct an initial exploration of factors underlying parents’ motivations to use good parenting practices, theoretical saturation (i.e., the point at which no new information, categories, or themes emerge) was obtained with a small sample ( n  = 15), most likely because the study was limited to parents of young children ( Hingle et al., 2012 ). If the goal of the study had been, for example, to identify racial/ethnic, gender, or age differences in food parenting practices, a larger sample would likely be needed to obtain saturation or informational adequacy.

Studies that seek to understand maximum variation in a phenomenon might also need a larger sample than one that seeks to understand extreme or atypical cases. For example, a qualitative study of diet and physical activity in young Australian men conducted focus groups to identify perceived motivators and barriers to healthy eating and physical activity and to examine the influence of body weight on their perceptions. Examining the influence of body weight status required 10 focus groups to allow for group assignment based on body mass index (Ashton et al., 2015). More specifically, 61 men were assigned to healthy-weight focus groups (n = 3 groups), overweight/obese focus groups (n = 3 groups), or mixed-weight focus groups (n = 4 groups). Had the researchers not been interested in whether facilitators and barriers differed by weight status, it is likely that theoretical saturation could have been obtained with fewer groups. Depth of inquiry also influences sample size (Sandelowski, 1995). For instance, an in-depth analysis of an intervention for children with cancer and their families included 16 family members from three families. Study data comprised 52 hours of videotaped intervention sessions and 10 interviews (West, Bell, Woodgate, & Moules, 2015). Depth was obtained through multiple data points and types of data, which justified sampling only a few families.

Authors of publications describing qualitative findings should show evidence that the data were “saturated” by a sample with sufficient variation to permit detailing shared and divergent perspectives, meanings, or experiences about the topic of inquiry. Decisions related to the sample (e.g., targeted recruitment) should be detailed in publications so that peer reviewers have the context for evaluating the sample and determining how the sample influenced the study findings ( Patton, 2015 ).

Qualitative Data Analysis

When conducting qualitative research, voluminous amounts of data are gathered and must be prepared (i.e., transcribed) and managed. During the analytic process, data are systematically transformed through identifying, defining, interpreting, and describing findings that are meant to comprehensively describe the phenomenon or the abstract qualities that they have in common. The process should be systematic ( dependability ) and well-documented in the analysis section of a qualitative manuscript. For example, Kelly and Ganong (2011) , in their study of medical treatment decisions made by families of children with cancer, described their analytic procedure by outlining their approach to coding and use of memoing (e.g., keeping careful notes about emerging ideas about the data throughout the analytic process), comparative analysis (e.g., comparing data against one another and looking for similarities and differences), and diagram drawing (e.g., pictorially representing the data structure, including relationships between codes).

How Should Researchers Document Coding Reliability?

Because the intent of qualitative research is to account for multiple perspectives, the goal of qualitative analysis is to comprehensively incorporate those perspectives into discernible findings. Researchers accustomed to doing quantitative studies may expect authors to quantify interrater reliability (e.g., kappa statistic) but this is not typical in qualitative research. Rather, the emphasis in qualitative research is on (1) training those gathering data to be rigorous and produce high-quality data and on (2) using systematic processes to document key decisions (e.g., code book), clear direction, and open communication among team members during data analysis. The goal is to make the most of the collective insight of the investigative team to triangulate or complement each other’s efforts to process and interpret the data. Instead of evaluating if two independent raters came to the same numeric rating, reviewers of qualitative manuscripts should judge to what extent the overall process of coding, data management, and data interpretation were systematic and rigorous. Authors of qualitative reports should articulate their coding procedures for others to evaluate. Together, these strategies promote trustworthiness of the study findings.

An example of how these processes are described in the report of a qualitative study is as follows:

The first two authors independently applied the categories to a sample of two interviews and compared their application of the categories to identify lack of clarity and overlap in categories. The investigators created a code book that contained a definition of categories, guidelines for their application, and excerpts of data exemplifying the categories. The first two authors independently coded the data and compared how they applied the categories to the data and resolved any differences during biweekly meetings. ATLAS.ti, version 6.2, was used to document and accommodate ongoing changes and additions to the coding structure ( Palma et al., 2015 , p. 224).
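For teams that keep their coding in plain data files, a comparison like the one described above can be partly mechanized. The Python sketch below, with invented excerpt IDs and category labels, lists the excerpts where two coders applied different categories so that differences can be discussed and resolved at coding meetings; it deliberately surfaces disagreements for conversation rather than reducing agreement to a single statistic.

    # Hypothetical sketch: list excerpts where two coders applied different categories.
    # Excerpt IDs and category labels are invented for illustration.
    coder_a = {"ex01": {"barriers"}, "ex02": {"support"}, "ex03": {"barriers", "cost"}}
    coder_b = {"ex01": {"barriers"}, "ex02": {"barriers"}, "ex03": {"cost"}}

    for excerpt_id in sorted(set(coder_a) | set(coder_b)):
        codes_a = coder_a.get(excerpt_id, set())
        codes_b = coder_b.get(excerpt_id, set())
        if codes_a != codes_b:
            print(f"{excerpt_id}: coder A {sorted(codes_a)} vs. coder B {sorted(codes_b)} -> discuss")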

Do I Need to Use a Specialized Qualitative Data Software Program for Analysis?

Multiple computer software packages for qualitative data analysis are currently available ( Silver & Lewins, 2014 ; Yin, 2015 ). These packages allow the researcher to import qualitative data (e.g., interview transcripts) into the software program and organize data segments (e.g., delineate which interview excerpts are relevant to particular themes). Qualitative analysis software can be useful for organizing and sorting through data, including during the analysis phase. Some software programs also offer sophisticated coding and visualization capabilities that facilitate and enhance interpretation and understanding. For example, if data segments are coded by specific characteristics (e.g., gender, race/ethnicity), the data can be sorted and analyzed by these characteristics, which may contribute to an understanding of whether and/or how a particular phenomenon may vary by these characteristics.
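As a toy illustration of this kind of sorting, independent of any particular software package, the Python sketch below groups invented coded excerpts by a participant characteristic; dedicated qualitative analysis programs perform the same kind of bookkeeping at much larger scale and with richer retrieval and visualization tools.

    # Hypothetical sketch: group excerpts tagged with one code by a participant characteristic.
    # Participants, codes, and excerpts are invented for illustration.
    from collections import defaultdict

    segments = [
        {"participant": "P01", "gender": "female", "code": "barriers", "excerpt": "The clinic hours never fit my schedule."},
        {"participant": "P02", "gender": "male", "code": "barriers", "excerpt": "I didn't know who to call."},
        {"participant": "P03", "gender": "female", "code": "support", "excerpt": "My sister reminds me about appointments."},
        {"participant": "P04", "gender": "male", "code": "barriers", "excerpt": "The copay was too high."},
    ]

    barriers_by_gender = defaultdict(list)
    for segment in segments:
        if segment["code"] == "barriers":
            barriers_by_gender[segment["gender"]].append(segment["excerpt"])

    for gender, excerpts in sorted(barriers_by_gender.items()):
        print(gender, excerpts)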

The strength of computer software packages for qualitative data analysis is their potential to contribute to methodological rigor by organizing the data for systematic analyses ( John & Johnson, 2000 ; MacMillan & Koenig, 2004 ). However, the programs do not replace the researchers’ analyses. The researcher or research team is ultimately responsible for analyzing the data, identifying the themes and patterns, and placing the findings within the context of the literature. In other words, qualitative data analysis software programs contribute to, but do not ensure scientific rigor or “objectivity” in, the analytic process. In fact, using a software program for analysis is not essential if the researcher demonstrates the use of alternative tools and procedures for rigor.

Presentation of Findings

Should There Be Overlap Between Presentation of Themes in the Results and Discussion Sections?

Qualitative papers sometimes combine results and discussion into one section to provide a cohesive presentation of the findings along with meaningful linkages to the existing literature ( Burnard, 2004 ; Burnard, Gill, Stewart, Treasure, & Chadwick, 2008 ). Although doing so is an acceptable method for reporting qualitative findings, some journals prefer the two sections to be distinct.

When the journal style is to distinguish the two sections, the results section should describe the findings, that is, the themes, while the discussion section should pull the themes together to make larger-level conclusions and place the findings within the context of the existing literature. For instance, the findings section of a study of how rural African-American adolescents, parents, and community leaders perceived obesity and topics for a proposed obesity prevention program, contained a description of themes about adolescent eating patterns, body shape, and feedback on the proposed weight gain prevention program according to each subset of participants (i.e., adolescents, parents, community leaders). The discussion section then put these themes within the context of findings from prior qualitative and intervention studies in related populations ( Cassidy et al., 2013 ). In the Discussion, when making linkages to the existing literature, it is important to avoid the temptation to extrapolate beyond the findings or to over-interpret them ( Burnard, 2004 ). Linkages between the findings and the existing literature should be supported by ample evidence to avoid spurious or misleading connections ( Burnard, 2004 ).

What Should I Include in the Results Section?

The results section of a qualitative research report is likely to contain more material than customary in quantitative research reports. Findings in a qualitative research paper typically include researcher interpretations of the data as well as data exemplars and the logic that led to researcher interpretations ( Sandelowski & Barroso, 2002 ). Interpretation pertains to the researcher breaking down and recombining the data and creating new meanings (e.g., abstract categories, themes, conceptual models). Select quotes from interviews or other types of data (e.g., participant observation, focus groups) are presented to illustrate or support researcher interpretations. Researchers trained in the quantitative tradition, where interpretation is restricted to the discussion section, may find this surprising; however, in qualitative methods, researcher interpretations represent an important component of the study results. The presentation of the findings, including researcher interpretations (e.g., themes) and data (e.g., quotes) supporting those interpretations, adds to the trustworthiness of the study ( Elo et al., 2014 ).

The Results section should contain a balance between data illustrations (i.e., quotes) and researcher interpretations ( Lofland & Lofland, 2006 ; Sandelowski, 1998 ). Because interpretation arises out of the data, description and interpretation should be combined. Description should be sufficient to support researcher interpretations, and quotes should be used judiciously ( Morrow, 2005 ; Sandelowski, 1994 ). Not every theme needs to be supported by multiple quotes. Rather, quotes should be carefully selected to provide “voice” to the participants and to help the reader understand the phenomenon from the participant’s perspective within the context of the researcher’s interpretation ( Morrow, 2005 ; Ritchie & Lewis, 2003 ). For example, researchers who developed a grounded theory of sexual risk behavior of urban American Indian adolescent girls identified desire for better opportunities as a key deterrent to neighborhood norms for early sexual activity. They illustrated this theme with the following quote: “I don’t want to live in the ‘hood and all that…My sisters are stuck there because they had babies. That isn’t going to happen to me” ( Saftner, Martyn, Momper, Loveland-Cherry, & Low, 2015 , p. 372).

There is no precise formula for the proportion of description to interpretation. Both descriptive and analytic excess should be avoided (Lofland & Lofland, 2006). The former pertains to the presentation of unedited field notes or interview transcripts rather than selecting and connecting data to analytic concepts that explain or summarize the data. The latter pertains to focusing on the mechanics of analysis and interpretation without substantiating researcher interpretations with quotes. Reviewer requests for methodological rigor can result in researchers writing qualitative research papers that suffer from analytic excess (Sandelowski & Barroso, 2002). Page limitations of most journals provide a safeguard against descriptive excess, but page limitations should not prevent researchers from providing the basis for their interpretations.

Additional potential problems with qualitative results sections include under-elaboration, where themes are too few and not clearly defined. The opposite problem, over-elaboration, pertains to too many analytic distinctions that could be collapsed under a higher level of abstraction. Quotes can also be under- or over-interpreted. Care should be taken to ensure the quote(s) selected clearly support the theme to which they are attached. And finally, findings from a qualitative study should be interesting and make clear contributions to the literature ( Lofland & Lofland, 2006 ; Morse, 2015b ).

Should I Quantify My Results? (e.g., Frequency With Which Themes Were Endorsed)

There is controversy over whether to quantify qualitative findings, such as providing counts for the frequency with which particular themes are endorsed by study participants ( Morgan, 1993 ; Sandelowski, 2001 ). Qualitative papers usually report themes and patterns that emerge from the data without quantification ( Dey, 1993 ). However, it is possible to quantify qualitative findings, such as in qualitative content analysis. Qualitative content analysis is a method through which a researcher identifies the frequency with which a phenomenon, such as specific words, phrases, or concepts, is mentioned ( Elo et al., 2014 ; Morgan, 1993 ). Although this method may appeal to quantitative reviewers, it is important to note that this method only fits specific study purposes, such as studies that investigate the language used by a particular group when communicating about a specific topic. In addition, results may be quantified to provide information on whether themes appeared to be common or atypical. Authors should avoid using imprecise language, such as “some participants” or “many participants.” A good example of quantification of results to illustrate more or less typical themes comes from a manuscript describing a qualitative study of school nurses’ perceived barriers to addressing obesity with students and their families. The authors described that all but one nurse reported not having the resources they needed to discuss weight with students and families whereas one-quarter of nurses reported not feeling competent to discuss weight issues ( Steele et al., 2011 ). If quantification of findings is used, authors should provide justification that explains how quantification is consistent with the aims or goals of the study ( Sandelowski, 2001 ).
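Where quantification fits the study purpose, a count like the school-nurse example can be produced directly from a simple endorsement table. The Python sketch below uses invented participants and themes to show the idea of reporting exact counts and proportions instead of vague quantifiers such as "many participants".

    # Hypothetical sketch: count how many participants endorsed each theme so a report
    # can say "3 of 4 participants" rather than "many participants".
    # Participants and themes are invented for illustration.
    from collections import Counter

    endorsements = {
        "Nurse01": {"lack of resources", "low confidence"},
        "Nurse02": {"lack of resources"},
        "Nurse03": {"lack of resources", "time pressure"},
        "Nurse04": {"time pressure"},
    }

    theme_counts = Counter(theme for themes in endorsements.values() for theme in themes)
    n_participants = len(endorsements)
    for theme, count in theme_counts.most_common():
        print(f"{theme}: {count}/{n_participants} participants ({count / n_participants:.0%})")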

Conclusions

This article highlighted key theoretical and logistical considerations that arise in designing, conducting, and reporting qualitative research studies (see Table 1 for a summary). This type of research is vital for obtaining patient, family, community, and other stakeholder perspectives about their needs and interests, and will become increasingly critical as our models of health care delivery evolve. For example, qualitative research could contribute to the study of health care providers and systems with the goal of optimizing our health care delivery models. Given the increasing diversity of the populations we serve, qualitative research will also be critical in providing guidance in how to tailor health interventions to key characteristics and increase the likelihood of acceptable, effective treatment approaches. For example, applying qualitative research methods could enhance our understanding of refugee experiences in our health care system, clarify treatment preferences for emerging adults in the midst of health care transitions, examine satisfaction with health care delivery, and evaluate the applicability of our theoretical models of health behavior changes across racial and ethnic groups. Incorporating patient perspectives into treatment is essential to meeting this nation’s priority on patient-centered health care ( Institute of Medicine Committee on Quality of Health Care in America, 2001 ). Authors of qualitative studies who address the methodological choices addressed in this review will make important contributions to the field of pediatric psychology. Qualitative findings will lead to a more informed field that addresses the needs of a wide range of patient populations and produces effective and acceptable population-specific interventions to promote health.

Acknowledgments

The authors thank Bridget Grahmann for her assistance with manuscript preparation.

This work was supported by the National Cancer Institute of the National Institutes of Health (K07CA196985 to Y.W.). It is also a publication of the United States Department of Agriculture/Agricultural Research Service (USDA/ARS) Children’s Nutrition Research Center, Department of Pediatrics, Baylor College of Medicine, Houston, Texas, funded in part with federal funds from the USDA/ARS under Cooperative Agreement No. 58-6250-0-008 (to D.T.). The contents of this publication do not necessarily reflect the views or policies of the USDA, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. government. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Conflicts of interest : None declared.

  • Aroian K. J., Peters R. M., Rudner N., Waser L. (2012). Hypertension prevention beliefs of hispanics . Journal of Transcultural Nursing , 23 , 134–142. doi:10.1177/1043659611433871. [ PubMed ] [ Google Scholar ]
  • Ashton L. M., Hutchesson M. J., Rollo M. E., Morgan P. J., Thompson D. I., Collins C. E. (2015). Young adult males’ motivators and perceived barriers towards eating healthily and being active: A qualitative study . The International Journal of Behavioral Nutrition and Physical Activity , 12 , 93 doi:10.1186/s12966‐015‐0257‐6. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Bagozzi R., Pieters R. (1998). Goal-directed emotions . Cognition & Emotion , 12 ( 1 ), 1–26. [ Google Scholar ]
  • Bandura A. (1986). Social foundations of thought and action: A social cognitive theory . Englewood Cliffs, NJ: Prentice-Hall Inc. [ Google Scholar ]
  • Bevans K. B., Gardner W., Pajer K., Riley A. W., Forrest C. B. (2013). Qualitative development of the PROMIS ® pediatric stress response item banks . Journal of Pediatric Psychology , 38 , 173–191. doi:10.1093/jpepsy/jss107. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Bradbury-Jones C., Taylor J., Herber O. (2014). How theory is used and articulated in qualitative research: Development of a new typology . Social Science and Medicine , 120 , 135–141. doi:10.1016/j.socscimed.2014.09.014. [ PubMed ] [ Google Scholar ]
  • Bryant A., Charmaz K. (2010). The Sage handbook of grounded theory . Thousand Oaks, CA: Sage. [ Google Scholar ]
  • Burnard P. (2004). Writing a qualitative research report . Nurse Education Today , 24 , 174–179. doi:10.1016/j.nedt.2003.11.005. [ PubMed ] [ Google Scholar ]
  • Burnard P., Gill P., Stewart K., Treasure E., Chadwick B. (2008). Analysing and presenting qualitative data . British Dental Journal , 204 , 429–432. doi:10.1038/sj.bdj.2008.292. [ PubMed ] [ Google Scholar ]
  • Cassidy O., Sbrocco T., Vannucci A., Nelson B., Jackson-Bowen D., Heimdal J., Heimdal J., Mirza N., Wilfley D. E., Osborn R., Shomaker L. B., Young J. F., Waldron H., Carter M., Tanofsky-Kraff M., (2013). Adapting interpersonal psychotherapy for the prevention of excessive weight gain in rural African American girls . Journal of Pediatric Psychology , 38 , 965–977. doi:10.1093/jpepsy/jst029. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Clark J. (2003). How to peer review a qualitative manuscript . Peer Review in Health Sciences , 2 , 219–235. [ Google Scholar ]
  • Corbin J., Strauss A. (2008). Basics of qualitative research (3rd ed.). Los Angeles, CA: Sage Publications. [ Google Scholar ]
  • Coyne I. T. (1997). Sampling in qualitative research. Purposeful and theoretical sampling; merging or clear boundaries? Journal of Advanced Nursing , 26 , 623–630. doi:10.1046/j.1365‐2648.1997.t01‐25‐00999.x. [ PubMed ] [ Google Scholar ]
  • Creswell J. W. (1994). Research design: Qualitative & quantitative approaches . Journal of Marketing Research , 33 , 252 doi:10.2307/3152153. [ Google Scholar ]
  • Creswell J. W. (2013a). Qualitative inquiry and research design: Choosing among five approaches . Thousand Oaks, CA: Sage Publications. [ Google Scholar ]
  • Creswell J. W. (2013b). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). Thousand Oaks, CA: Sage Publications. [ Google Scholar ]
  • Creswell J. W., Klassen A. C., Plano Clark V. L., Smith K. C.;for the Office of Behavioral and Social Sciences Research. (2011). Best practices for mixed methods research in the health sciences . Retrieved from National Institutes of Health: http://obssr.od.nih.gov/mixed_methods_research .
  • de Visser R. O., Graber R., Hart A., Abraham C., Scanlon T., Watten P., Memon A. (2015). Using qualitative methods within a mixed-methods approach to developing and evaluating interventions to address harmful alcohol use among young people . Health Psychology , 34 , 349–360. doi:10.1037/hea0000163. [ PubMed ] [ Google Scholar ]
  • Dey I. (1993). Qualitative data analysis: A user-friendly guide for social scientists . New York, NY: Routledge. [ Google Scholar ]
  • Dixon-Woods M., Shaw R. L., Agarwal S., Smith J. A. (2004). The problem of appraising qualitative research . Quality and Safety in Health Care , 13 , 223–225. doi:10.1136/qhc.13.3.223. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Eakin J. M., Mykhalovskiy E. (2003). Reframing the evaluation of qualitative health research: Reflections on a review of appraisal guidelines in the health sciences . Journal of Evaluation in Clinical Practice , 9 , 187–194. doi:10.1046/j.1365‐2753.2003.00392.x. [ PubMed ] [ Google Scholar ]
  • Elo S., Kääriäinen M., Kanste O., Pölkki T., Utriainen K., Kyngäs H. (2014). Qualitative content analysis: A focus on trustworthiness . SAGE Open , 4 ( 1 ), 1–10. doi:10.1177/2158244014522633. [ Google Scholar ]
  • Glaser B., Strauss A. (1967). The discovery of grounded theory: Strategies for qualitative research. Nursing Research, 17, 364. doi:10.1097/00006199-196807000-00014. [ Google Scholar ]
  • Gough B., Deatrick J. A. (2015). Qualitative health psychology research: Diversity, power, and impact . Health Psychology , 34 , 289–292. doi:10.1037/hea0000206. [ PubMed ] [ Google Scholar ]
  • Guba E. G. (1981). Criteria for assessing the trustworthiness of naturalistic inquiries . Educational Communication and Technology , 29 , 75–91. doi:10.1007/BF02766777. [ Google Scholar ]
  • Haukeland Y. B., Fjermestad K. W., Mossige S., Vatne T. M. (2015). Emotional experiences among siblings of children with rare disorders . Journal of Pediatric Psychology , 40 , 12–20. doi:10.1093/jpepsy/jsv022. [ PubMed ] [ Google Scholar ]
  • Hess J. S., Straub D. M. (2011). Brief report: Preliminary findings from a pilot health care transition education intervention for adolescents and young adults with special health care needs . Journal of Pediatric Psychology , 36 , 172–178. doi:10.1093/jpepsy/jsq091. [ PubMed ] [ Google Scholar ]
  • Hingle M., Beltran A., O’Connor T., Thompson D., Baranowski J., Baranowski T. (2012). A model of goal directed vegetable parenting practices . Appetite , 58 , 444–449. doi:10.1016/j.appet.2011.12.011. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Hughes S. O., Power T. G., Papaioannou M. A., Cross M. B., Nicklas T. A., Hall S. K., Shewchuk R. M. (2011). Emotional climate, feeding practices, and feeding styles: An observational analysis of the dinner meal in Head Start families . The International Journal of Behavioral Nutrition and Physical Activity , 8 , 60 doi:10.1186/1479‐5868‐8‐60. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Institute of Medicine Committee on Quality of Health Care in America. (2001). Crossing the quality chasm: A new health system for the 21st century . National Academies Press. Washington, DC. [ Google Scholar ]
  • Izaguirre M. R., Keefer L. (2014). Development of a self-efficacy scale for adolescents and young adults with inflammatory bowel disease . Journal of Pediatric Gastroenterology and Nutrition , 59 , 29–32. doi:10.1097/mpg.0000000000000357. [ PubMed ] [ Google Scholar ]
  • John W. S., Johnson P. (2000). The pros and cons of data analysis software for qualitative research . Journal of Nursing Scholarship , 32 , 393–397. [ PubMed ] [ Google Scholar ]
  • Johnson J. C. (1990). Selecting ethnographic informants . Sage Publications. Thousand Oaks, CA. [ Google Scholar ]
  • Kars M. C., Grypdonck M. H., de Bock L. C., van Delden J. J. (2015). The parents’ ability to attend to the “voice of their child” with incurable cancer during the palliative phase . Health Psychology , 34 , 446–452. doi:10.1037/hea0000166. [ PubMed ] [ Google Scholar ]
  • Kelly K., Ganong L. (2011). Moving to place: Childhood cancer treatment decision making in single-parent and repartnered family structures . Qualitative Health Research , 21 , 349–364. doi:10.1177/1049732310385823. [ PubMed ] [ Google Scholar ]
  • Kelly M. (2010). The role of theory in qualitative health research . Family Practice , 27 , 285–290. doi:10.1093/fampra/cmp077. [ PubMed ] [ Google Scholar ]
  • Krefting L. (1991). Rigor in qualitative research: The assessment of trustworthiness . The American Journal of Occupational Therapy , 45 , 214–222. doi:10.5014/ajot.45.3.214. [ PubMed ] [ Google Scholar ]
  • Lincoln Y. S., Guba E. G. (1985). Naturalistic inquiry . Newbury Park, CA: Sage Publications. [ Google Scholar ]
  • Lincoln Y. S., Lynham S. A., Guba E. G. (2011). Paradigmatic controversies, contradictions, and emerging confluences, revisited . In Denzin N. K., Lincoln Y. S. (Eds.), The Sage handbook of qualitative research (4th ed., pp. 97–128). Thousand Oaks, CA: Sage. [ Google Scholar ]
  • Lofland J., Lofland L. H. (2006). Analyzing social settings: A guide to qualitative observation and analysis . Belmont, CA: Wadsworth Publishing Company. [ Google Scholar ]
  • Lyons A. C., Goodwin I., McCreanor T., Griffin C. (2015). Social networking and young adults’ drinking practices: Innovative qualitative methods for health behavior research . Health Psychology , 34 , 293–302. doi:10.1037/hea0000168. [ PubMed ] [ Google Scholar ]
  • MacMillan K., Koenig T. (2004). The wow factor: Preconceptions and expectations for data analysis software in qualitative research . Social Science Computer Review , 22 , 179–186. doi:10.1177/0894439303262625. [ Google Scholar ]
  • Mason M. (Producer). (2010). Sample size and saturation in PhD studies using qualitative interviews . Forum: Qualitative Social Research . Retrieved from http://nbn-resolving.de/urn:nbn:de:0114-fqs100387 .
  • Mays N., Pope C. (2000). Qualitative research in health care: Assessing quality in qualitative research . British Medical Journal , 320 , 50 doi:10.1136/bmj.320.7226.50. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • McDonald C. C., Sommers M. S. (2015). Teen drivers’ perceptions of inattention and cell phone use while eriving . Traffic Injury Prevention , 16 ( Suppl 2 ), S52–S58. doi:10.1080/15389588.2015.1062886. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Miles M. B., Huberman A. M., Saldaña J. (2013). Qualitative data analysis: A methods sourcebook (3rd ed.). Thousand Oaks, CA: Sage Publications. [ Google Scholar ]
  • Minges K. E., Owen N., Salmon J., Chao A., Dunstan D. W., Whittemore R. (2015). Reducing youth screen time: Qualitative metasynthesis of findings on barriers and facilitators . Health Psychology , 34 , 381–397. doi:10.1037/hea0000172. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Morgan D. L. (1993). Qualitative content analysis: A guide to paths not taken . Qualitative Health Research , 3 , 112–121. doi:10.1177/104973239300300107. [ PubMed ] [ Google Scholar ]
  • Morgan D. L. (1998). Planning Focus Groups: Focus Group Kit #2 . Thousand Oaks, CA: Sage Publications. [ Google Scholar ]
  • Morrow S. (2005). Quality and trustworthiness in qualitative research in counseling psychology . Journal of Counseling Psychology , 52 , 250–260. doi:10.1037/0022‐0167.52.2.250. [ Google Scholar ]
  • Morse J. M. (2015a). Critical analysis of strategies for determining rigor in qualitative inquiry . Qualitative Health Research , 25 , 1212–1222. doi:10.1177/1049732315588501. [ PubMed ] [ Google Scholar ]
  • Morse J. M. (2015b). Data were saturated . Qualitative Health Research , 25 , 587–588. doi:10.1177/1049732315576699. [ PubMed ] [ Google Scholar ]
  • Palermo T. M. (2013). New guidelines for publishing review articles in JPP: Systematic reviews and topical reviews . Journal of Pediatric Psychology , 38 , 5–9. doi:10.1093/jpepsy/jss124. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Palermo T. M. (2014). Evidence-based interventions in pediatric psychology: Progress over the decades . Journal of Pediatric Psychology , 39 , 753–762. doi:10.1093/jpepsy/jsu048. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Palma E., Deatrick J., Hobbie W., Ogle S., Kobayashi K., Maldonado L. (2015). Maternal caregiving demands for adolescent and young adult survivors of pediatric brain tumors . Oncology Nursing Forum , 42 , 222–229. doi:10.1188/15.ONF.. [ PubMed ] [ Google Scholar ]
  • Patton M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and practice (4th ed.). Thousand Oaks, CA: Sage Publications. [ Google Scholar ]
  • Pierce J. S., Wysocki T. (2015). Topical Review: Advancing research on the transition to adult care for type 1 diabetes . Journal of Pediatric Psychology , 40 , 1041–1047. doi:10.1093/jpepsy/jsv064. [ PubMed ] [ Google Scholar ]
  • Pierce J. S., Wysocki T., Aroian K. (2016). Multiple stakeholder perspectives on health care transition outcomes in Type 1 Diabetes . Unpublished data. [ Google Scholar ]
  • Polanyi M. (1958). Personal knowledge . New York, NY: Harper & Row. [ Google Scholar ]
  • Power T. G., Hughes S. O., Goodell L. S., Johnson S. L., Duran J. A., Williams K., Beck A. D., Frankel L. A. (2015). Feeding practices of low-income mothers: How do they compare to current recommendations? The International Journal of Behavioral Nutrition and Physical Activity , 12 , 34 doi:10.1186/s12966‐015‐0179‐3. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Richards L., Morse J. M. (2013). Readdme first for a user’s guide to qualitative methods (3rd ed.). Thousand Oaks, CA: Sage Publications. [ Google Scholar ]
  • Ritchie J., Lewis J. (Eds.). (2003). Qualitative research practice: A guide for social science students and researchers . Thousand Oaks, CA: Sage Publications. [ Google Scholar ]
  • Saftner M. A., Martyn K. K., Momper S. L., Loveland-Cherry C. J., Low L. K. (2015). Urban American Indian adolescent girls framing sexual risk behavior . Journal of Transcultural Nursing , 26 , 365–375. doi:10.1177/1043659614524789. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Saldaña J. (2012). The coding manual for qualitative researchers . Thousand Oaks, CA: Sage Publications. [ Google Scholar ]
  • Sandelowski M. (1993a). Rigor or rigor mortis: The problem of rigor in qualitative research revisited . Advances in Nursing Science , 16 , 1–8. [ PubMed ] [ Google Scholar ]
  • Sandelowski M. (1993b). Theory unmasked: The uses and guises of theory in qualitative research . Research in Nursing & Health , 16 , 213–218. doi:10.1002/nur.4770160308. [ PubMed ] [ Google Scholar ]
  • Sandelowski M. (1994). The use of quotes in qualitative research . Research in Nursing and Health , 17 , 479–482. [ PubMed ] [ Google Scholar ]
  • Sandelowski M. (1995). Sample size in qualitative research . Research in Nursing and Health , 18 , 179–183. [ PubMed ] [ Google Scholar ]
  • Sandelowski M. (1998). Writing a good read: Strategies for re-presenting qualitative data . Research in Nursing and Health , 21 , 375–382. doi:10.1016/s1361‐3111(98)80052‐6. [ PubMed ] [ Google Scholar ]
  • Sandelowski M. (2001). Real qualitative researchers do not count: The use of numbers in qualitative research . Research in Nursing and Health , 24 , 230–240. [ PubMed ] [ Google Scholar ]
  • Sandelowski M. (2010). What’s in a name? Qualitative description revisited . Research in Nursing and Health , 33 , 77–84. doi:10.1002/nur.20362.. [ PubMed ] [ Google Scholar ]
  • Sandelowski M., Barroso J. (2002). Finding the findings in qualitative studies . Journal of Nursing Scholarship , 34 , 213–219. [ PubMed ] [ Google Scholar ]
  • Schwartz L. A., Tuchman L. K., Hobbie W. L., Ginsberg J. P. (2011). A social-ecological model of readiness for transition to adult-oriented care for adolescents and young adults with chronic health conditions . Child: Care, Health, and Development , 37 , 883–895. doi:10.1111/j.1365‐2214.2011.01282.x. [ PubMed ] [ Google Scholar ]
  • Silver C., Lewins A. (2014). Using software in qualitative research: A step-by-step guide (2nd ed.). London: Sage Publications. [ Google Scholar ]
  • Steele R. G., Wu Y. P., Jensen C. D., Pankey S., Davis A. M., Aylward B. S. (2011). School nurses’ perceived barriers to discussing weight with children and their families: A qualitative approach . Journal of School Health , 81 , 128–137. doi:10.1111/j.1746‐1561.2010.00571.x. [ PubMed ] [ Google Scholar ]
  • Thompson D. (2014). Talk to me, please!: The importance of qualitative research to games for health . Games for Health: Research, Development, and Clinical Applications , 3 , 117–118. doi:10.1089/g4h.2014.0023. [ PubMed ] [ Google Scholar ]
  • Thompson D., Baranowski T., Buday R., Baranowski J., Juliano M., Frazior M., Wilsdon J., Jago R. (2007). In pursuit of change: Youth response to intensive goal setting embedded in a serious video game . Journal of Diabetes Science and Technology , 1 , 907–917. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Thompson D., Bhatt R., Watson K. (2013). Physical activity problem-solving inventory for adolescents: Development and initial validation . Pediatric Exercise Science , 25 , 448–467. [ PubMed ] [ Google Scholar ]
  • Tobin G. A., Begley C. M. (2004). Methodological rigour within a qualitative framework . Journal of Advanced Nursing , 48 , 388–396. doi:10.1111/j.1365‐2648.2004.03207.x. [ PubMed ] [ Google Scholar ]
  • Tong A., Sainsbury P., Craig J. (2007). Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups . International Journal for Quality in Health Care , 19 , 349–357. doi:10.1093/intqhc/mzm042. [ PubMed ] [ Google Scholar ]
  • Tuckett A. G. (2004). Qualitative research sampling: The very real complexities . Nurse Researcher , 12 , 47–61. doi:10.7748/nr2004.07.12.1.47.c5930. [ PubMed ] [ Google Scholar ]
  • Valenzuela J. M., Buchanan C. L., Radcliffe J., Ambrose C., Hawkins L. A., Tanney M., Rudy B. J. (2011). Transition to adult services among behaviorally infected adolescents with HIV—a qualitative study . Journal of Pediatric Psychology , 36 , 134–140. doi:10.1093/jpepsy/jsp051. [ PubMed ] [ Google Scholar ]
  • West C. H., Bell J. M., Woodgate R. L., Moules N. J. (2015). Waiting to return to normal: An exploration of family systems intervention in childhood cancer . Journal of Family Nursing , 21 , 261–294. doi:10.1177/1074840715576795. [ PubMed ] [ Google Scholar ]
  • Whittemore R., Chase S. K., Mandle C. L. (2001). Validity in qualitative research . Qualitative Health Research , 11 , 522–537. doi:10.1177/104973201129119299. [ PubMed ] [ Google Scholar ]
  • Yin R. K. (2015). Qualitative research from start to finish (2nd ed.). New York, NY: Guilford Press. [ Google Scholar ]
