
Writing Research Papers

Writing is easy. All you do is stare at a blank sheet of paper until drops of blood form on your forehead. --- Gene Fowler

A major goal of this course is the development of effective technical writing skills. To help you become an accomplished writer, you will prepare several research papers based upon the studies completed in lab. Our research papers are not typical "lab reports." In a teaching lab a lab report might be nothing more than answers to a set of questions. Such an assignment hardly represents the kind of writing you might be doing in your eventual career.

Written and oral communications skills are probably the most universal qualities sought by graduate and professional schools as well as by employers. You alone are responsible for developing such skills to a high level.

Resources for learning technical writing

Before you begin your first writing assignment, please consult all of the following resources, in order to gain the most benefit from the experience.

  • General form of a typical research article
  • Specific guidelines (if any) for the assignment – see the writeups on individual lab studies
  • McMillan, V.E. "Writing Papers in the Biological Sciences," 3rd ed. New York: Bedford/St. Martin's, 2001. ISBN 0-312-25857-7 (REQUIRED for Bioc 211 and 311; recommended for other science courses that include writing)
  • Writing portfolio examples (pdf)

As you polish up your writing skills, please make use of the following resources.

For Biosciences majors, the general guidelines apply to future course work, as can be seen by examining the guidelines for the advanced experimental sciences research paper (Bioc 311).

General form of a research paper

An objective of organizing a research paper is to allow people to read your work selectively. When I research a topic, I may be interested in just the methods, a specific result, the interpretation, or perhaps I just want to see a summary of the paper to determine if it is relevant to my study. To this end, many journals require the following sections, submitted in the order listed, each section to start on a new page. There are variations, of course. Some journals call for a combined results and discussion, for example, or include materials and methods after the body of the paper. The well-known journal Science does away with separate sections altogether, except for the abstract.

Your papers are to adhere to the form and style required for the Journal of Biological Chemistry, requirements that are shared by many journals in the life sciences.

General style

Specific editorial requirements for submission of a manuscript will always supersede instructions in these general guidelines.

To make a paper readable

  • Print or type using a 12 point standard font, such as Times, Geneva, Bookman, Helvetica, etc.
  • Text should be double spaced on 8 1/2" x 11" paper with 1 inch margins, single sided
  • Number pages consecutively
  • Start each new section on a new page
  • Adhere to recommended page limits

Mistakes to avoid

  • Placing a heading at the bottom of a page with the following text on the next page (insert a page break!)
  • Dividing a table or figure - confine each figure/table to a single page
  • Submitting a paper with pages out of order

In all sections of your paper

  • Use normal prose including articles ("a", "the," etc.)
  • Stay focused on the research topic of the paper
  • Use paragraphs to separate each important point (except for the abstract)
  • Indent the first line of each paragraph
  • Present your points in logical order
  • Use present tense to report well accepted facts - for example, 'the grass is green'
  • Use past tense to describe specific results - for example, 'When weed killer was applied, the grass was brown'
  • Avoid informal wording, don't address the reader directly, and don't use jargon, slang terms, or superlatives
  • Avoid use of superfluous pictures - include only those figures necessary to present results

Title Page

Select an informative title as illustrated in the examples in your writing portfolio example package. Include the name(s) and address(es) of all authors, and date submitted. "Biology lab #1" would not be an informative title, for example.

Abstract

The summary should be two hundred words or less. See the examples in the writing portfolio package.

General intent

An abstract is a concise single paragraph summary of completed work or work in progress. In a minute or less a reader can learn the rationale behind the study, general approach to the problem, pertinent results, and important conclusions or new questions.

Writing an abstract

Write your summary after the rest of the paper is completed. After all, how can you summarize something that is not yet written? Economy of words is important throughout any paper, but especially in an abstract. However, use complete sentences and do not sacrifice readability for brevity. You can keep it concise by wording sentences so that they serve more than one purpose. For example, "In order to learn the role of protein synthesis in early development of the sea urchin, newly fertilized embryos were pulse-labeled with tritiated leucine, to provide a time course of changes in synthetic rate, as measured by total counts per minute (cpm)." This sentence provides the overall question, methods, and type of analysis, all in one sentence. The writer can now go directly to summarizing the results.

Summarize the study, including the following elements in any abstract. Try to keep the first two items to no more than one sentence each.

  • Purpose of the study - hypothesis, overall question, objective
  • Model organism or system and brief description of the experiment
  • Results, including specific data - if the results are quantitative in nature, report quantitative data; results of any statistical analysis should be reported
  • Important conclusions or questions that follow from the experiment(s)

Style:

  • Single paragraph, and concise
  • As a summary of work done, it is always written in past tense
  • An abstract should stand on its own, and not refer to any other part of the paper such as a figure or table
  • Focus on summarizing results - limit background information to a sentence or two, if absolutely necessary
  • What you report in an abstract must be consistent with what you reported in the paper
  • Correct spelling, clarity of sentences and phrases, and proper reporting of quantities (proper units, significant figures) are just as important in an abstract as they are anywhere else

Introduction

Your introductions should not exceed two pages (double spaced, typed). See the examples in the writing portfolio package.

General intent

The purpose of an introduction is to acquaint the reader with the rationale behind the work, with the intention of defending it. It places your work in a theoretical context, and enables the reader to understand and appreciate your objectives.

Writing an introduction

The abstract is the only text in a research paper that is written without using paragraphs to separate major points. Approaches to the introduction vary widely; however, for our studies the following approach can produce an effective introduction.

  • Describe the importance (significance) of the study - why was this worth doing in the first place? Provide a broad context.
  • Defend the model - why did you use this particular organism or system? What are its advantages? You might comment on its suitability from a theoretical point of view as well as indicate practical reasons for using it.
  • Provide a rationale. State your specific hypothesis(es) or objective(s), and describe the reasoning that led you to select them.
  • Very briefly describe the experimental design and how it accomplished the stated objectives.

Style:

  • Use past tense except when referring to established facts. After all, the paper will be submitted after all of the work is completed.
  • Organize your ideas, making one major point with each paragraph. If you make the four points listed above, you will need a minimum of four paragraphs.
  • Present background information only as needed in order to support a position. The reader does not want to read everything you know about a subject.
  • State the hypothesis/objective precisely - do not oversimplify.
  • As always, pay attention to spelling, clarity and appropriateness of sentences and phrases.

Materials and Methods

There is no specific page limit, but a key concept is to keep this section as concise as you possibly can. People will want to read this material selectively. The reader may only be interested in one formula or part of a procedure. Materials and methods may be reported under separate subheadings within this section or can be incorporated together.

General intent

This should be the easiest section to write, but many students misunderstand the purpose. The objective is to document all specialized materials and general procedures, so that another individual may use some or all of the methods in another study or judge the scientific merit of your work. It is not to be a step by step description of everything you did, nor is a methods section a set of instructions. In particular, it is not supposed to tell a story. By the way, your notebook should contain all of the information that you need for this section.

Writing a materials and methods section

Materials:

  • Describe materials separately only if the study is so complicated that it saves space this way.
  • Include specialized chemicals, biological materials, and any equipment or supplies that are not commonly found in laboratories.
  • Do not include commonly found supplies such as test tubes, pipet tips, beakers, etc., or standard lab equipment such as centrifuges, spectrophotometers, pipettors, etc.
  • If use of a specific type of equipment, a specific enzyme, or a culture from a particular supplier is critical to the success of the experiment, then it and its source should be singled out; otherwise, there is no need to identify the source.
  • Materials may be reported in a separate paragraph or else they may be identified along with your procedures.
  • In biosciences we frequently work with solutions - refer to them by name and describe them completely, including concentrations of all reagents, pH of aqueous solutions, and solvent if non-aqueous.

Methods:
  • See the examples in the writing portfolio package
  • Report the methodology (not details of each procedure that employed the same methodology)
  • Describe the methodology completely, including such specifics as temperatures, incubation times, etc.
  • To be concise, present methods under headings devoted to specific procedures or groups of procedures
  • Generalize - report how procedures were done, not how they were specifically performed on a particular day. For example, report "samples were diluted to a final concentration of 2 mg/ml protein;" don't report that "135 microliters of sample one was diluted with 330 microliters of buffer to make the protein concentration 2 mg/ml" (the arithmetic behind this example is sketched after these lists). Always think about what would be relevant to an investigator at another institution, working on his/her own project.
  • If well documented procedures were used, report the procedure by name, perhaps with reference, and that's all. For example, the Bradford assay is well known. You need not report the procedure in full - just that you used a Bradford assay to estimate protein concentration, and identify what you used as a standard. The same is true for the SDS-PAGE method, and many other well known procedures in biology and biochemistry.

Style:
  • It is awkward or impossible to use active voice when documenting methods without using first person, which would focus the reader's attention on the investigator rather than the work. Therefore, when writing up the methods, most authors use third person passive voice.
  • Use normal prose in this and in every other section of the paper – avoid informal lists, and use complete sentences.
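
As an aside, here is the arithmetic implied by the dilution example above, written out using the standard dilution relationship C1V1 = C2V2. The stock concentration is not stated in the example; it is simply what the quoted volumes imply, so treat this as an illustration of the numbers only, not as something to reproduce in a methods section.

    C_1 V_1 = C_2 V_2
    C_1 \times 135\,\mu\mathrm{L} = 2\,\mathrm{mg/mL} \times (135 + 330)\,\mu\mathrm{L}
    C_1 = \frac{2 \times 465}{135} \approx 6.9\,\mathrm{mg/mL}

In other words, the sample was diluted roughly 3.4-fold from a stock of about 6.9 mg/ml; the guideline stands - report only the final concentration (2 mg/ml) in the methods.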

What to avoid

  • Materials and methods are not a set of instructions.
  • Omit all explanatory information and background - save it for the discussion.
  • Omit information that is irrelevant to a third party, such as what color ice bucket you used, or which individual logged in the data.

Results

The page length of this section is set by the amount and types of data to be reported. Continue to be concise, using figures and tables, if appropriate, to present results most effectively. See recommendations for content, below.

General intent

The purpose of a results section is to present and illustrate your findings. Make this section a completely objective report of the results, and save all interpretation for the discussion.

Writing a results section

IMPORTANT: You must clearly distinguish material that would normally be included in a research article from any raw data or other appendix material that would not be published. In fact, such material should not be submitted at all unless requested by the instructor.

Content

  • Summarize your findings in text and illustrate them, if appropriate, with figures and tables.
  • In text, describe each of your results, pointing the reader to observations that are most relevant.
  • Provide a context, such as by describing the question that was addressed by making a particular observation.
  • Describe results of control experiments and include observations that are not presented in a formal figure or table, if appropriate.
  • Analyze your data, then prepare the analyzed (converted) data in the form of a figure (graph), table, or in text form.

What to avoid

  • Do not discuss or interpret your results, report background information, or attempt to explain anything.
  • Never include raw data or intermediate calculations in a research paper.
  • Do not present the same data more than once.
  • Text should complement any figures or tables, not repeat the same information.
  • Please do not confuse figures with tables - there is a difference.

Style

  • As always, use past tense when you refer to your results, and put everything in a logical order.
  • In text, refer to each figure as "figure 1," "figure 2," etc.; number your tables as well (see the reference text for details)
  • Place figures and tables, properly numbered, in order at the end of the report (clearly distinguish them from any other material such as raw data, standard curves, etc.)
  • If you prefer, you may place your figures and tables appropriately within the text of your results section.

Figures and tables

  • Either place figures and tables within the text of the results section, or include them in the back of the report (following Literature Cited) - do one or the other
  • If you place figures and tables at the end of the report, make sure they are clearly distinguished from any attached appendix materials, such as raw data
  • Regardless of placement, each figure must be numbered consecutively and complete with caption (caption goes under the figure)
  • Regardless of placement, each table must be titled, numbered consecutively and complete with heading (title with description goes above the table)
  • Each figure and table must be sufficiently complete that it could stand on its own, separate from text

Discussion

Journal guidelines vary. Space is so valuable in the Journal of Biological Chemistry that authors are asked to restrict discussions to four pages or less, double spaced, typed. That works out to one printed page. While you are learning to write effectively, the limit will be extended to five typed pages. If you practice economy of words, that should be plenty of space within which to say all that you need to say.

General intent

The objective here is to provide an interpretation of your results and support for all of your conclusions, using evidence from your experiment and generally accepted knowledge, if appropriate. The significance of findings should be clearly described.

Writing a discussion

Interpret your data in the discussion in appropriate depth. This means that when you explain a phenomenon you must describe mechanisms that may account for the observation. If your results differ from your expectations, explain why that may have happened. If your results agree, then describe the theory that the evidence supported. It is never appropriate to simply state that the data agreed with expectations, and let it drop at that.

  • Decide if each hypothesis is supported, rejected, or if you cannot make a decision with confidence. Do not simply dismiss a study or part of a study as "inconclusive."
  • Research papers are not accepted if the work is incomplete. Draw what conclusions you can based upon the results that you have, and treat the study as a finished work.
  • You may suggest future directions, such as how the experiment might be modified to accomplish another objective.
  • Explain all of your observations as much as possible, focusing on mechanisms.
  • Decide if the experimental design adequately addressed the hypothesis, and whether or not it was properly controlled.
  • Try to offer alternative explanations if reasonable alternatives exist.
  • One experiment will not answer an overall question, so keeping the big picture in mind, where do you go next? The best studies open up new avenues of research. What questions remain?
  • Recommendations for specific papers will provide additional suggestions.

Style:
  • When you refer to information, distinguish data generated by your own studies from published information or from information obtained from other students (verb tense is an important tool for accomplishing that purpose).
  • Refer to work done by specific individuals (including yourself) in past tense.
  • Refer to generally accepted facts and principles in present tense. For example, "Doofus, in a 1989 survey, found that anemia in basset hounds was correlated with advanced age. Anemia is a condition in which there is insufficient hemoglobin in the blood."

The biggest mistake that students make in discussions is to present a superficial interpretation that more or less re-states the results. It is necessary to suggest why results came out as they did, focusing on the mechanisms behind the observations.

Literature Cited

Please note that in the introductory laboratory course, you will not be required to properly document sources of all of your information. One reason is that your major source of information is this website, and websites are inappropriate as primary sources. Second, it is problematic to provide a hundred students with equal access to potential reference materials. You may nevertheless find outside sources, and you should cite any articles that the instructor provides or that you find for yourself.

List all literature cited in your paper, in alphabetical order, by first author. In a proper research paper, only primary literature is used (original research articles authored by the original investigators). Be cautious about using web sites as references - anyone can put just about anything on a web site, and you have no sure way of knowing if it is truth or fiction. If you are citing an online journal, use the journal citation (name, volume, year, page numbers). Some of your papers may not require references, and if that is the case simply state that "no references were consulted."

Glossary of Research Terms

This glossary is intended to assist you in understanding commonly used terms and concepts when reading, interpreting, and evaluating scholarly research in the social sciences. Also included are general words and phrases defined within the context of how they apply to research in the social and behavioral sciences.

  • Acculturation -- refers to the process of adapting to another culture, particularly in reference to blending in with the majority population [e.g., an immigrant adopting American customs]. However, acculturation also implies that both cultures add something to one another, but still remain distinct groups unto themselves.
  • Accuracy -- a term used in survey research to refer to the match between the target population and the sample.
  • Affective Measures -- procedures or devices used to obtain quantified descriptions of an individual's feelings, emotional states, or dispositions.
  • Aggregate -- a total created from smaller units. For instance, the population of a county is an aggregate of the populations of the cities, rural areas, etc. that comprise the county. As a verb, it refers to totaling data from smaller units into a larger unit.
  • Anonymity -- a research condition in which no one, including the researcher, knows the identities of research participants.
  • Baseline -- a control measurement carried out before an experimental treatment.
  • Behaviorism -- school of psychological thought concerned with the observable, tangible, objective facts of behavior, rather than with subjective phenomena such as thoughts, emotions, or impulses. Contemporary behaviorism also emphasizes the study of mental states such as feelings and fantasies to the extent that they can be directly observed and measured.
  • Beliefs -- ideas, doctrines, tenets, etc. that are accepted as true on grounds which are not immediately susceptible to rigorous proof.
  • Benchmarking -- systematically measuring and comparing the operations and outcomes of organizations, systems, processes, etc., against agreed upon "best-in-class" frames of reference.
  • Bias -- a loss of balance and accuracy in the use of research methods. It can appear in research via the sampling frame, random sampling, or non-response. It can also occur at other stages in research, such as while interviewing, in the design of questions, or in the way data are analyzed and presented. Bias means that the research findings will not be representative of, or generalizable to, a wider population.
  • Case Study -- the collection and presentation of detailed information about a particular participant or small group, frequently including data derived from the subjects themselves.
  • Causal Hypothesis -- a statement hypothesizing that the independent variable affects the dependent variable in some way.
  • Causal Relationship -- the relationship established that shows that an independent variable, and nothing else, causes a change in a dependent variable. It also establishes how much of a change is shown in the dependent variable.
  • Causality -- the relation between cause and effect.
  • Central Tendency -- any way of describing or characterizing typical, average, or common values in some distribution.
  • Chi-square Analysis -- a common non-parametric statistical test which compares an expected proportion or ratio to an actual proportion or ratio [the standard formula is sketched at the end of this glossary].
  • Claim -- a statement, similar to a hypothesis, which is made in response to the research question and that is affirmed with evidence based on research.
  • Classification -- ordering of related phenomena into categories, groups, or systems according to characteristics or attributes.
  • Cluster Analysis -- a method of statistical analysis where data that share a common trait are grouped together. The data are collected in a way that allows the data collector to group them according to certain characteristics.
  • Cohort Analysis -- group by group analytic treatment of individuals having a statistical factor in common to each group. Group members share a particular characteristic [e.g., born in a given year] or a common experience [e.g., entering a college at a given time].
  • Confidentiality -- a research condition in which no one except the researcher(s) knows the identities of the participants in a study. It refers to the treatment of information that a participant has disclosed to the researcher in a relationship of trust and with the expectation that it will not be revealed to others in ways that violate the original consent agreement, unless permission is granted by the participant.
  • Confirmability [Objectivity] -- the findings of the study could be confirmed by another person conducting the same study.
  • Construct -- refers to any of the following: something that exists theoretically but is not directly observable; a concept developed [constructed] for describing relations among phenomena or for other research purposes; or, a theoretical definition in which concepts are defined in terms of other concepts. For example, intelligence cannot be directly observed or measured; it is a construct.
  • Construct Validity -- seeks an agreement between a theoretical concept and a specific measuring device, such as observation.
  • Constructivism -- the idea that reality is socially constructed. It is the view that reality cannot be understood outside of the way humans interact, and that knowledge is constructed, not discovered. Constructivists believe that learning is more active and self-directed than either behaviorism or cognitive theory would postulate.
  • Content Analysis -- the systematic, objective, and quantitative description of the manifest or latent content of print or nonprint communications.
  • Context Sensitivity -- awareness by a qualitative researcher of factors such as values and beliefs that influence cultural behaviors.
  • Control Group -- the group in an experimental design that receives either no treatment or a different treatment from the experimental group. This group can thus be compared to the experimental group.
  • Controlled Experiment -- an experimental design with two or more randomly selected groups [an experimental group and control group] in which the researcher controls or introduces the independent variable and measures the dependent variable at least two times [pre- and post-test measurements].
  • Correlation -- a common statistical analysis, usually abbreviated as r, that measures the degree of relationship between pairs of interval variables in a sample. The range of correlation is from -1.00 to zero to +1.00 [the standard formula is sketched at the end of this glossary]. Also, a non-cause and effect relationship between two variables.
  • Covariate -- a product of the correlation of two related variables times their standard deviations. Used in true experiments to measure the difference of treatment between them.
  • Credibility -- a researcher's ability to demonstrate that the object of a study is accurately identified and described based on the way in which the study was conducted.
  • Critical Theory -- an evaluative approach to social science research, associated with Germany's neo-Marxist “Frankfurt School,” that aims to criticize as well as analyze society, opposing the political orthodoxy of modern communism. Its goal is to promote human emancipatory forces and to expose ideas and systems that impede them.
  • Data -- factual information [as measurements or statistics] used as a basis for reasoning, discussion, or calculation.
  • Data Mining -- the process of analyzing data from different perspectives and summarizing it into useful information, often to discover patterns and/or systematic relationships among variables.
  • Data Quality -- this is the degree to which the collected data [results of measurement or observation] meet the standards of quality to be considered valid [trustworthy] and  reliable [dependable].
  • Deductive -- a form of reasoning in which conclusions are formulated about particulars from general or universal premises.
  • Dependability -- being able to account for changes in the design of the study and the changing conditions surrounding what was studied.
  • Dependent Variable -- a variable that varies due, at least in part, to the impact of the independent variable. In other words, its value “depends” on the value of the independent variable. For example, in the variables “gender” and “academic major,” academic major is the dependent variable, meaning that your major cannot determine whether you are male or female, but your gender might indirectly lead you to favor one major over another.
  • Deviation -- the distance between the mean and a particular data point in a given distribution.
  • Discourse Community -- a community of scholars and researchers in a given field who respond to and communicate to each other through published articles in the community's journals and presentations at conventions. All members of the discourse community adhere to certain conventions for the presentation of their theories and research.
  • Discrete Variable -- a variable that is measured solely in whole units, such as gender and number of siblings.
  • Distribution -- the range of values of a particular variable.
  • Effect Size -- the amount of change in a dependent variable that can be attributed to manipulations of the independent variable. A large effect size exists when the value of the dependent variable is strongly influenced by the independent variable. It is the mean difference on a variable between experimental and control groups divided by the standard deviation on that variable of the pooled groups or of the control group alone [this calculation is sketched at the end of this glossary].
  • Emancipatory Research -- research is conducted on and with people from marginalized groups or communities. It is led by a researcher or research team who is either an indigenous or external insider; is interpreted within intellectual frameworks of that group; and, is conducted largely for the purpose of empowering members of that community and improving services for them. It also engages members of the community as co-constructors or validators of knowledge.
  • Empirical Research -- the process of developing systematized knowledge gained from observations that are formulated to support insights and generalizations about the phenomena being researched.
  • Epistemology -- concerns knowledge construction; asks what constitutes knowledge and how knowledge is validated.
  • Ethnography -- method to study groups and/or cultures over a period of time. The goal of this type of research is to comprehend the particular group/culture through immersion into the culture or group. Research is completed through various methods but, since the researcher is immersed within the group for an extended period of time, more detailed information is usually collected during the research.
  • Expectancy Effect -- any unconscious or conscious cues that convey to the participant in a study how the researcher wants them to respond. Expecting someone to behave in a particular way has been shown to promote the expected behavior. Expectancy effects can be minimized by using standardized interactions with subjects, automated data-gathering methods, and double blind protocols.
  • External Validity -- the extent to which the results of a study are generalizable or transferable.
  • Factor Analysis -- a statistical test that explores relationships among data. The test explores which variables in a data set are most related to each other. In a carefully constructed survey, for example, factor analysis can yield information on patterns of responses, not simply data on a single response. Larger tendencies may then be interpreted, indicating behavior trends rather than simply responses to specific questions.
  • Field Studies -- academic or other investigative studies undertaken in a natural setting, rather than in laboratories, classrooms, or other structured environments.
  • Focus Groups -- small, roundtable discussion groups charged with examining specific topics or problems, including possible options or solutions. Focus groups usually consist of 4-12 participants, guided by moderators to keep the discussion flowing and to collect and report the results.
  • Framework -- the structure and support that may be used as both the launching point and the on-going guidelines for investigating a research problem.
  • Generalizability -- the extent to which research findings and conclusions drawn from a study conducted on a specific sample or group can be applied to the population at large.
  • Grounded Theory -- practice of developing other theories that emerge from observing a group. Theories are grounded in the group's observable experiences, but researchers add their own insight into why those experiences exist.
  • Group Behavior -- behaviors of a group as a whole, as well as the behavior of an individual as influenced by his or her membership in a group.
  • Hypothesis -- a tentative explanation based on theory to predict a causal relationship between variables.
  • Independent Variable -- the conditions of an experiment that are systematically manipulated by the researcher. A variable that is not impacted by the dependent variable, and that itself impacts the dependent variable. In the earlier example of "gender" and "academic major," (see Dependent Variable) gender is the independent variable.
  • Individualism -- a theory or policy having primary regard for the liberty, rights, or independent actions of individuals.
  • Inductive -- a form of reasoning in which a generalized conclusion is formulated from particular instances.
  • Inductive Analysis -- a form of analysis based on inductive reasoning; a researcher using inductive analysis starts with answers, but formulates questions throughout the research process.
  • Insiderness -- a concept in qualitative research that refers to the degree to which a researcher has access to and an understanding of persons, places, or things within a group or community based on being a member of that group or community.
  • Internal Consistency -- the extent to which all questions or items assess the same characteristic, skill, or quality.
  • Internal Validity -- the rigor with which the study was conducted [e.g., the study's design, the care taken to conduct measurements, and decisions concerning what was and was not measured]. It is also the extent to which the designers of a study have taken into account alternative explanations for any causal relationships they explore. In studies that do not explore causal relationships, only the first of these definitions should be considered when assessing internal validity.
  • Life History -- a record of an event/events in a respondent's life told [written down, but increasingly audio or video recorded] by the respondent from his/her own perspective in his/her own words. A life history is different from a "research story" in that it covers a longer time span, perhaps a complete life, or a significant period in a life.
  • Margin of Error -- the permissible or acceptable deviation from the target or a specific value. The allowance for slight error or miscalculation or changing circumstances in a study.
  • Measurement -- process of obtaining a numerical description of the extent to which persons, organizations, or things possess specified characteristics.
  • Meta-Analysis -- an analysis combining the results of several studies that address a set of related hypotheses.
  • Methodology -- a theory or analysis of how research does and should proceed.
  • Methods -- systematic approaches to the conduct of an operation or process. It includes steps of procedure, application of techniques, systems of reasoning or analysis, and the modes of inquiry employed by a discipline.
  • Mixed-Methods -- a research approach that uses two or more methods from both the quantitative and qualitative research categories. It is also referred to as blended methods, combined methods, or methodological triangulation.
  • Modeling -- the creation of a physical or computer analogy to understand a particular phenomenon. Modeling helps in estimating the relative magnitude of various factors involved in a phenomenon. A successful model can be shown to account for unexpected behavior that has been observed, to predict certain behaviors, which can then be tested experimentally, and to demonstrate that a given theory cannot account for certain phenomena.
  • Models -- representations of objects, principles, processes, or ideas often used for imitation or emulation.
  • Naturalistic Observation -- observation of behaviors and events in natural settings without experimental manipulation or other forms of interference.
  • Norm -- the norm in statistics is the average or usual performance. For example, students usually complete their high school graduation requirements when they are 18 years old. Even though some students graduate when they are younger or older, the norm is that any given student will graduate when he or she is 18 years old.
  • Null Hypothesis -- the proposition, to be tested statistically, that the experimental intervention has "no effect," meaning that the treatment and control groups will not differ as a result of the intervention. Investigators usually hope that the data will demonstrate some effect from the intervention, thus allowing the investigator to reject the null hypothesis.
  • Ontology -- a discipline of philosophy that explores the science of what is, the kinds and structures of objects, properties, events, processes, and relations in every area of reality.
  • Panel Study -- a longitudinal study in which a group of individuals is interviewed at intervals over a period of time.
  • Participant -- individuals whose physiological and/or behavioral characteristics and responses are the object of study in a research project.
  • Peer-Review -- the process in which the author of a book, article, or other type of publication submits his or her work to experts in the field for critical evaluation, usually prior to publication. This is standard procedure in publishing scholarly research.
  • Phenomenology -- a qualitative research approach concerned with understanding certain group behaviors from that group's point of view.
  • Philosophy -- critical examination of the grounds for fundamental beliefs and analysis of the basic concepts, doctrines, or practices that express such beliefs.
  • Phonology -- the study of the ways in which speech sounds form systems and patterns in language.
  • Policy -- governing principles that serve as guidelines or rules for decision making and action in a given area.
  • Policy Analysis -- systematic study of the nature, rationale, cost, impact, effectiveness, implications, etc., of existing or alternative policies, using the theories and methodologies of relevant social science disciplines.
  • Population -- the target group under investigation. The population is the entire set under consideration. Samples are drawn from populations.
  • Position Papers -- statements of official or organizational viewpoints, often recommending a particular course of action or response to a situation.
  • Positivism -- a doctrine in the philosophy of science, positivism argues that science can only deal with observable entities known directly to experience. The positivist aims to construct general laws, or theories, which express relationships between phenomena. Observation and experiment are used to show whether the phenomena fit the theory.
  • Predictive Measurement -- use of tests, inventories, or other measures to determine or estimate future events, conditions, outcomes, or trends.
  • Principal Investigator -- the scientist or scholar with primary responsibility for the design and conduct of a research project.
  • Probability -- the chance that a phenomenon will occur randomly. As a statistical measure, it is shown as p [the "p" factor].
  • Questionnaire -- structured sets of questions on specified subjects that are used to gather information, attitudes, or opinions.
  • Random Sampling -- a process used in research to draw a sample of a population strictly by chance, yielding no discernible pattern beyond chance. Random sampling can be accomplished by first numbering the population, then selecting the sample according to a table of random numbers or using a random-number computer generator. The sample is said to be random because there is no regular or discernible pattern or order. Random sample selection is used under the assumption that sufficiently large samples assigned randomly will exhibit a distribution comparable to that of the population from which the sample is drawn. The random assignment of participants increases the probability that differences observed between participant groups are the result of the experimental intervention.
  • Reliability -- the degree to which a measure yields consistent results. If the measuring instrument [e.g., survey] is reliable, then administering it to similar groups would yield similar results. Reliability is a prerequisite for validity. An unreliable indicator cannot produce trustworthy results.
  • Representative Sample -- sample in which the participants closely match the characteristics of the population, and thus, all segments of the population are represented in the sample. A representative sample allows results to be generalized from the sample to the population.
  • Rigor -- degree to which research methods are scrupulously and meticulously carried out in order to recognize important influences occurring in an experimental study.
  • Sample -- the population researched in a particular study. Usually, attempts are made to select a "sample population" that is considered representative of groups of people to whom results will be generalized or transferred. In studies that use inferential statistics to analyze results or which are designed to be generalizable, sample size is critical, generally the larger the number in the sample, the higher the likelihood of a representative distribution of the population.
  • Sampling Error -- the degree to which the results from the sample deviate from those that would be obtained from the entire population, because of random error in the selection of respondents and the corresponding reduction in reliability.
  • Saturation -- a situation in which data analysis begins to reveal repetition and redundancy and when new data tend to confirm existing findings rather than expand upon them.
  • Semantics -- the relationship between symbols and meaning in a linguistic system. Also, the cuing system that connects what is written in the text to what is stored in the reader's prior knowledge.
  • Social Theories -- theories about the structure, organization, and functioning of human societies.
  • Sociolinguistics -- the study of language in society and, more specifically, the study of language varieties, their functions, and their speakers.
  • Standard Deviation -- a measure of variation that indicates the typical distance between the scores of a distribution and the mean; it is determined by taking the square root of the average of the squared deviations in a given distribution [this calculation is sketched at the end of this glossary]. It can be used to indicate the proportion of data within certain ranges of scale values when the distribution conforms closely to the normal curve.
  • Statistical Analysis -- application of statistical processes and theory to the compilation, presentation, discussion, and interpretation of numerical data.
  • Statistical Bias -- characteristics of an experimental or sampling design, or the mathematical treatment of data, that systematically affects the results of a study so as to produce incorrect, unjustified, or inappropriate inferences or conclusions.
  • Statistical Significance -- the probability that the difference between the outcomes of the control and experimental groups is great enough that it is unlikely to be due solely to chance. The probability that the null hypothesis can be rejected at a predetermined significance level [0.05 or 0.01].
  • Statistical Tests -- researchers use statistical tests to make quantitative decisions about whether a study's data indicate a significant effect from the intervention and allow the researcher to reject the null hypothesis. That is, statistical tests show whether the differences between the outcomes of the control and experimental groups are great enough to be statistically significant. If differences are found to be statistically significant, it means that the probability [likelihood] that these differences occurred solely due to chance is relatively low. Most researchers agree that a significance value of .05 or less [i.e., the probability that the differences are due to chance alone is 5% or less] sufficiently determines significance.
  • Subcultures -- ethnic, regional, economic, or social groups exhibiting characteristic patterns of behavior sufficient to distinguish them from the larger society to which they belong.
  • Testing -- the act of gathering and processing information about individuals' ability, skill, understanding, or knowledge under controlled conditions.
  • Theory -- a general explanation about a specific behavior or set of events that is based on known principles and serves to organize related events in a meaningful way. A theory is not as specific as a hypothesis.
  • Treatment -- the stimulus given to a dependent variable.
  • Trend Samples -- method of sampling different groups of people at different points in time from the same population.
  • Triangulation -- a multi-method or pluralistic approach, using different methods in order to focus on the research topic from different viewpoints and to produce a multi-faceted set of data. Also used to check the validity of findings from any one method.
  • Unit of Analysis -- the basic observable entity or phenomenon being analyzed by a study and for which data are collected in the form of variables.
  • Validity -- the degree to which a study accurately reflects or assesses the specific concept that the researcher is attempting to measure. A method can be reliable, consistently measuring the same thing, but not valid.
  • Variable -- any characteristic or trait that can vary from one person to another [race, gender, academic major] or for one person over time [age, political beliefs].
  • Weighted Scores -- scores in which the components are modified by different multipliers to reflect their relative importance.
  • White Paper -- an authoritative report that often states the position or philosophy about a social, political, or other subject, or a general explanation of an architecture, framework, or product technology written by a group of researchers. A white paper seeks to contain unbiased information and analysis regarding a business or policy problem that the researchers may be facing.
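
Several of the statistical entries above (Chi-square Analysis, Correlation, Effect Size, Standard Deviation) describe calculations in words. For readers who want to see them written out, the standard textbook forms of those formulas are sketched below. These are generic versions offered only as a reading aid, not formulas taken from any particular study, and individual sources may use slightly different conventions (for example, dividing by N - 1 rather than N when estimating a standard deviation from a sample).

    Standard deviation of scores x_1, ..., x_N with mean \bar{x}:
    s = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(x_i - \bar{x})^2}

    Pearson correlation between paired variables x and y:
    r = \frac{\sum_{i}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i}(x_i - \bar{x})^2}\,\sqrt{\sum_{i}(y_i - \bar{y})^2}}, \qquad -1 \le r \le +1

    Chi-square statistic for observed counts O_i and expected counts E_i:
    \chi^2 = \sum_{i}\frac{(O_i - E_i)^2}{E_i}

    Effect size (Cohen's d), as described in the entry above:
    d = \frac{\bar{x}_{\mathrm{experimental}} - \bar{x}_{\mathrm{control}}}{s_{\mathrm{pooled}}}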

Sources: Free Social Science Dictionary. Socialsciencedictionary.com [2008]; Glossary. Institutional Review Board, Colorado College; Glossary of Key Terms. Writing@CSU, Colorado State University; Glossary A-Z. Education.com; Glossary of Research Terms. Research Mindedness Virtual Learning Resource, Centre for Human Service Technology, University of Southampton; Jupp, Victor. The SAGE Dictionary of Social and Cultural Research Methods. London: Sage, 2006.
