

What is longitudinal data?
Message posted by Ivan on June 12, 2000 at 12:00 AM (ET)

I'm having some trouble finding out what longitudinal data is, what some examples of it are, and how to analyze this type of data.
Can anyone give me some help or any pointers?


READERS RESPOND:
(In chronological order. Most recent at the bottom.)

Re: What is longitudinal data?
Message posted by Phil on June 14, 2000 at 12:00 AM (ET)

A "longitudinal study" is one where you follow the same group of subjects over time. Longitudinal data are the data collected from each individual over that time period. For example, you might follow a group of subjects throughout their teen years and collect data for each one along the way.

The alternative is to do a "developmental study," where you measure or sample at various stages of interest, but the subjects are different at each stage.
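Phil's description can be sketched with a short, made-up example: in a longitudinal data set, the same subject contributes one record per measurement occasion, so the data group naturally into per-subject trajectories. The subject IDs and scores below are purely hypothetical.

```python
# Hypothetical longitudinal data: the same subjects measured repeatedly
# over time. Each subject contributes one record per measurement occasion.
longitudinal = [
    # (subject_id, age, score)
    ("s1", 13, 48), ("s1", 15, 55), ("s1", 17, 61),
    ("s2", 13, 52), ("s2", 15, 54), ("s2", 17, 60),
]

# Group the repeated measurements by subject to see each trajectory.
trajectories = {}
for subject, age, score in longitudinal:
    trajectories.setdefault(subject, []).append((age, score))

print(trajectories["s1"])  # [(13, 48), (15, 55), (17, 61)]
```

In a cross-sectional design, by contrast, each subject ID would appear only once, so no within-subject trajectory could be formed.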


Re: What is longitudinal data?
Message posted by Doug Mahoney on June 23, 2000 at 12:00 AM (ET)

There are many approaches to the analysis of longitudinal data, depending on the situation.

One classical approach is the derived-variable approach. This is where the data collected over time are transformed into a single summary measure, such as a mean, a least-squares slope, or some index of relative change over time. These summary indexes then become the "analysis endpoints" for testing differences between groups of patients or correlations with baseline measures.

One aspect of longitudinal data that must be addressed in any analysis is the correlation between the sequential measurements. If this correlation is ignored, the estimates for effects will in general be too large and the standard errors too small. A computer package that I use frequently is Proc Mixed (SAS). This is an ANOVA-like modeling approach that allows the user to investigate different correlation structures between the repeated measurements and their effect on tests of significance.

Another approach is to use GEE (generalized estimating equations) to estimate effects and standard errors. This approach doesn't rely as heavily on the Gaussian (i.e., normal) distribution as the Proc Mixed approach; GEE methodology uses the first two moments of the distribution to obtain its estimates. Also, if you aren't really sure of the underlying correlation structure, you can take a "best guess" approach, and the GEE method will compute a "sandwich"-type estimate between your guess and the observed correlation structure, giving what is considered a "robust" estimate.

Authors to do literature searches on are: PJ Diggle, KY Liang, and SL Zeger; RD Wolfinger; AS Bryk and SW Raudenbush; MJ Crowder and DJ Hand.
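The derived-variable approach Doug describes can be sketched in a few lines: collapse each subject's repeated measurements into one summary number (here, a least-squares slope over time), then compare the groups with a standard two-sample t-test. This is only an illustration, not his exact procedure, and the measurements below are invented for the example.

```python
import numpy as np
from scipy import stats

times = np.array([0, 1, 2, 3])  # measurement occasions

# Repeated measurements: rows = subjects, columns = occasions (made up).
group_a = np.array([[10, 12, 14, 16],
                    [ 9, 11, 13, 15],
                    [10, 13, 15, 18]], dtype=float)
group_b = np.array([[10, 10, 11, 11],
                    [ 9, 10, 10, 11],
                    [11, 11, 12, 12]], dtype=float)

def slopes(measurements):
    # Least-squares slope of each subject's trajectory over time:
    # the "derived variable" summarizing that subject's change.
    return np.array([np.polyfit(times, row, 1)[0] for row in measurements])

a, b = slopes(group_a), slopes(group_b)

# The per-subject slopes are the "analysis endpoints": one number per
# subject, so an ordinary two-sample test between groups is valid.
t, p = stats.ttest_ind(a, b)
print(a.mean(), b.mean(), p)
```

Note that the t-test here is applied to one summary value per subject, which is what sidesteps the within-subject correlation problem Doug raises; a Proc Mixed or GEE analysis would instead model that correlation directly on the raw repeated measurements.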



Your $5 contribution helps cover part of the $500 annual cost of keeping this site online.

Niles Online Sites: RobertNiles.com | Theme Park Insider | Violinist.com

RobertNiles.com™, the site, content and services © Copyright 1996-2002, Robert Niles.
All rights reserved. Questions? Comments? Read my Privacy Policy, or E-mail me!