## Introduction

Whenever a statistical test concludes that a relationship is significant when, in reality, there is no relationship, a *false discovery* has been made. When multiple tests are conducted, this leads to the *multiple testing problem* (also known as the *multiple comparisons problem* or the *post hoc testing problem*, and closely related to data dredging and data mining): the more tests that are conducted, the more false discoveries are made.
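The scale of the problem is easy to demonstrate with a quick simulation. This is a minimal sketch, assuming 100 independent tests in which the null hypothesis is true for every one; under the null, each p-value is uniform on [0, 1], so at a 0.05 cutoff roughly 5 of the 100 tests will appear "significant" by chance alone.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# 100 tests where there is genuinely nothing to find: under the null
# hypothesis, each p-value is a uniform draw on [0, 1].
n_tests = 100
p_values = [random.random() for _ in range(n_tests)]

# Every p-value below 0.05 is a false discovery, since no real
# relationship exists in any of these tests.
false_discoveries = sum(p < 0.05 for p in p_values)
print(false_discoveries)
```

On average this prints a number near 5, and the expected count of false discoveries grows in direct proportion to the number of tests run.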

The *multiple comparisons problem* is more likely to arise in situations where many comparisons (tests) are run in a single table, such as tracking surveys with tables that include many individual waves of data. If the significance testing results in a table change after adding a new wave of data, or the results in the table differ from what you expect, examine the *multiple comparison correction* approach you use and determine whether it's appropriate for your analysis.

Multiple comparison corrections attempt to fix this problem. They work by requiring results to have smaller p-values before they are classified as significant.
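As a concrete illustration, the Benjamini-Hochberg procedure is the standard way to control the false discovery rate: sort the p-values, and find the largest rank k whose p-value is at or below (k / m) × alpha, where m is the number of tests. The sketch below is an assumed, simplified implementation for illustration; the exact algorithm used by any particular software's FDR option may differ in its details.

```python
def fdr_bh(p_values, alpha=0.05):
    """Benjamini-Hochberg FDR correction (sketch).

    Returns a list of booleans, True where the corresponding test
    remains significant after correction.
    """
    m = len(p_values)
    # Indices of the p-values sorted in ascending order.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Largest rank k (1-based) with p_(k) <= (k / m) * alpha.
    max_k = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * alpha:
            max_k = rank
    # All tests at ranks 1..max_k are declared significant.
    significant = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= max_k:
            significant[i] = True
    return significant

# Uncorrected, all four of these would pass at alpha = 0.05; with
# FDR correction the weakest result (p = 0.5) is screened out.
print(fdr_bh([0.01, 0.02, 0.03, 0.5]))  # [True, True, True, False]
```

Note that the procedure compares each sorted p-value to a threshold that shrinks with its rank, so a result can only survive correction if the p-values below it are also small enough.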

This article describes how to go from a table showing significant differences at a 95% confidence level:

To a table showing significant differences at a 95% confidence level AND with *false discovery rate* correction applied:

## Method

To apply multiple comparison correction to Exception Tests:

- Right-click the table and select **Table Options > Statistical Assumptions > Exception Tests**
- From the **Multiple comparison correction** menu, select **False Discovery Rate (FDR)**
- Click **OK**
- Select **Apply to Selection** to apply it to just this table, or **Apply as Default** to make it the default for all crosstabs with date variables in the document.

The results are as follows:

### Restore all of the fields to their default values

- Click the **Restore** button

## Next

How to Show Column Comparisons to the Right of Values in a Table

How to Compare Significant Differences Between Columns

How to Do Planned ANOVA-Type Tests
