Experiment and Evaluation of Information Systems

16:194:619

Spring 2008

72229 Section 01

W 3:10-5:50 PM

CIL-313, College Avenue

 

Instructor: Paul Kantor

310 SCILS Building

732 932 7500 x8216 kantor@scils.rutgers.edu

 

Catalog Description: Measures, models, and methods for macro-evaluation of the impact of information systems within their environment, and for micro-evaluation of the performance of system components. Experiments: their design, conduct, and results.

 

Personal Note: Since not every student is interested in Information Systems, I make a reasonable effort to permit students interested in other application areas to develop parallel term paper topics more closely related to their own dissertation interests. This requires somewhat more effort from the student, but yields more immediately applicable results. See comments from recent students at http://www.scils.rutgers.edu/~kantor/t/619phd/2008/StudentComments.html

 

Pre- and Co-requisites

SCILS Quantitative Analysis course (604), or a second course in applied statistics that has gone beyond simple t-tests and measures of central tendency. A clear understanding of the meaning of confidence intervals is a plus, but is not absolutely necessary.

 

Extended Abstract:

An advanced course in the complete design and interpretation of experimental and survey studies of information and communication systems. Topics include: non-normal distributions; confidence intervals; parameter estimation; experimental design; path analysis; multivariate analysis of variance; design of a protocol and informed consent; elicitation of stakeholder goals and objectives; management of studies; analysis and interpretation of results; presentation to non-technical audiences. Each student will work with some shared data sets. In addition, students may choose either to work in pairs on projects specific to this course, or to conduct a study or survey which contributes to practicum or dissertation research (working, in that case, individually). This course is very strongly recommended for students planning a quantitative dissertation, and will prepare them to conduct the necessary qualitative work to design effective instruments. It is also recommended for students who intend to supplement a qualitative dissertation with some numerical analysis of results, such as content analysis, to strengthen that component of the research.

Upon successful completion of this course students will be able to compute ANOVA parameters; create phantom or simulated data; conduct factor analyses and scale development; design, analyze, interpret and report upon an experimental or survey study.

Students will do several homework assignments, and will design and conduct an experiment or survey.

DETAILED LISTING OF TOPICS/MODULES.

This course complements and extends the core courses (603, 604) by showing how to bring together qualitative and quantitative methods in the ways that are used in studies of libraries, information systems, and communication systems.

The course will use two extensive sets of data for example purposes, and students will use or develop data related to their own research or studies. Throughout the course we will use SPSS, which is widely used in social science research.

The principal modules of the course are:

  1. Quick review of elementary statistics: parametric and non-parametric concepts and tests; t-test; F-test; ANOVA and regression; significance measures. Parameters as random variables.
  2. How to prepare your IRB proposal. Students who do not have IRB certification for the conduct of studies involving Human Subjects should plan to complete that certification during the first two weeks of the semester.
  3. Review of not-so-elementary statistics. ANOVA again; interaction effects; estimating parameters; defining models; the confidence interval as a random variable. Interpretation of confidence intervals.
  4. Moving on. Abnormal distributions. Discrete multinomial and binomial distributions; lognormal and Pareto distributions; description of human behavior and economic phenomena. The Internet; library sizes; Bradford's law. Sample statistics for abnormal distributions are still normal.
  5. Moving on. Multivariate models. Estimating parameters. Selecting models. Split half tests. Building phantom data sets. Managing the data flow process (white envelopes). Formats for recording your analysis.
  6. Defining a study, Phase I: qualitative. Goals and objectives; interviewing management and other stakeholders; defining goals of the system to be studied; defining goals of the study itself [not the same!]. Selecting a study method: experiment, survey, or naturally occurring experiment. Model interview guidelines. "Is there something I should have asked you, but I didn't?"
  7. Defining a study, Phase II. Learning the respondent's language to communicate the ideas to be studied; hear and incorporate that language. Are the questions dimensions, or components of a scale?
  8. Defining a study, Phase III. Build candidate models. Focus the data collection. Build measures or scales: buy, borrow, or build? Making phantom data to sharpen your own ideas. Managing the data flow.
  9. Defining a study, Phase IV. Economics of recruitment and data collection. Building a sample. Stratification. Control variables, or analyze their effects. Estimating the power of your study.
  10. Writing the instruments. Clear questions. Single concept per question. Online data collection. Introduction to Survey Monkey. Skip patterns. Exporting data. Using a survey tool in an experimental setting.
  11. ANALYSIS METHODS. Data management for your own project. Select methods. Exploratory techniques. Save and reuse your code. Keep a data analysis journal. Becoming a power user: factor analysis followed by regression or ANOVA; path analysis methods; inferring causality.
  12. RESULTS. Selection and presentation of analytic results. Renaming variables to improve communication. Selecting the appropriate representation for each class of variables. Tufte principles. Identifying the most salient findings of a study. [They may not be what you were looking for].
  13. CONCLUSIONS. Interpretation of the data. Explaining what the results mean: for managers; for other stakeholders; for other scholars. [A bad example is given at the end of this short syllabus. See how many errors you can spot in it.]
  14. DISCUSSION. How to point from a completed study towards the appropriate future. Is more study needed? Is a change in operations or systems called for? Does the study indicate the nature of that change? If not, why not? Has the study justified its own costs?
  15. Presentations of student projects. We usually invite interested faculty to attend these presentations, as they have, in the past, been quite impressive.
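SPSS is the working tool for the course, but the "phantom data" idea from module 5 is language-independent: simulate data under a model you control, then confirm that your analysis recovers the effect you built in. As a purely illustrative sketch (the group names, means, and seed below are invented, not part of the course materials):

```python
# Phantom-data sketch: simulate two groups with a known true mean
# difference, then compute a pooled two-sample t statistic by hand
# to check that the analysis "sees" the built-in effect.
import random

random.seed(619)

# Two groups with a known true difference of 0.5 in the mean.
group_a = [random.gauss(3.0, 1.0) for _ in range(30)]
group_b = [random.gauss(3.5, 1.0) for _ in range(30)]

def mean(xs):
    return sum(xs) / len(xs)

def pooled_t(a, b):
    """Two-sample t statistic with pooled variance."""
    na, nb = len(a), len(b)
    ma, mb = mean(a), mean(b)
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (mb - ma) / (sp2 * (1 / na + 1 / nb)) ** 0.5

print(f"simulated mean difference: {mean(group_b) - mean(group_a):.2f}")
print(f"pooled t statistic: {pooled_t(group_a, group_b):.2f}")
```

Because the true parameters are known, a phantom data set of this kind also lets you rehearse the entire analysis pipeline (coding, file formats, the "white envelopes" data flow of module 5) before any real respondent data arrives.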

 

OUTLINE OF DOING A STUDY

Define objective and protocol

Develop recruitment instruments

Write the informed consent

Obtain IRB approval. Note the added requirements for audio or video recording.

 

Grading formula:

Weekly online "calisthenics": 40% (= 4% per week; none for the last 4 weeks).

Analysis of common data set: 20%

Term project: Paper: 20%; Presentation: 10% (Total=30%)

Class participation: 10%

Grade scale: A: 100-90; B: 89-80; C: 79-70; D: 69-60.
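For concreteness, the weighting above can be checked with a few lines of arithmetic. This is only a hypothetical illustration: the weights are taken from this syllabus, but the component scores are invented.

```python
# Weighted grade computation using the syllabus weights.
weights = {
    "calisthenics": 0.40,   # 10 weekly exercises at 4% each
    "common_data": 0.20,    # analysis of common data set
    "term_paper": 0.20,
    "presentation": 0.10,
    "participation": 0.10,
}

def final_grade(scores):
    """Weighted average of 0-100 component scores, mapped to a letter."""
    total = sum(weights[k] * scores[k] for k in weights)
    for letter, floor in [("A", 90), ("B", 80), ("C", 70), ("D", 60)]:
        if total >= floor:
            return round(total, 2), letter
    return round(total, 2), "F"

print(final_grade({"calisthenics": 95, "common_data": 88,
                   "term_paper": 85, "presentation": 90,
                   "participation": 100}))  # → (91.6, 'A')
```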