CE679 Regression and Stochastic Methods (Fall 2008)


Instructor: Nikolay S. Strigul
E-mail: nstrigul@stevens.edu
Office Hours: by appointment

Lectures: Rocco Building, 1st floor, Mondays, 6:15 pm - 8:45 pm
Assignments, Exams, Quizzes: There will be regular homework assignments, a midterm exam, and a final exam.

Grading:

Course program (PDF)

General comments:

CE679 is an introduction to practical statistical methods for students majoring in the sciences and engineering. Statistical reasoning plays a critical role in the modern sciences, as many real-life problems naturally involve a large amount of uncertainty and randomness. This course will teach students to use regression models as a research tool in experimental studies, working through particular real-life examples (taken mostly from the environmental sciences). It will introduce stochastic methods through commonly used regression models, such as linear models, polynomial and exponential models, the Monod model (a nonlinear regression model widely used in environmental and biomedical research), sigmoidal models, and simple hierarchical models. The Bayesian approach will be considered in substantial depth. Commercial and free statistical software for both Windows and Linux will be reviewed, and some programs, such as Origin, WinBUGS, Statistica, and Mathematica, will be introduced in greater detail and then used widely throughout the course. Optimal experimental design theory will be introduced and illustrated with the online programs at www.optimal-design.org. Graduate and advanced undergraduate students from any science or engineering department who are interested in contemporary applied statistics are welcome to register. Particular topics include: the Bayesian approach, causal inference, linear and multiple linear regression, nonlinear regression models, dose-response models, analysis of variance, and optimal experimental design.

Textbooks:

1) A Second Course in Statistics: Regression Analysis by W. Mendenhall and T. Sincich
2) Probability & Statistics for Engineers and Scientists by R.E. Walpole, R.H. Myers, S.L. Myers, K. Ye
3) Mathematical Statistics and Data Analysis by J.A. Rice
4) Bayesian Data Analysis by A. Gelman, J.B. Carlin, H.S. Stern and D.B. Rubin

Course program:



Lecture 1. - Probability Theory

1. The empirical background.
2. The sample space, events.
3. Random variables.
4. Probability distribution functions.

Lecture 2. - Discrete probability distributions

5. Discrete uniform distribution
6. Binomial and negative binomial distributions
7. Hypergeometric distribution
8. Poisson distribution

Lecture 3. - Continuous probability distributions

9. The normal distribution
10. Lognormal distribution
11. Gamma distribution
12. Beta distribution
13. Student's t distribution
14. F distribution

Lecture 3 (continued). - Fitting probability distributions.

15. Example: fitting the Poisson distribution
16. Parameter estimation
17. The method of moments
18. The method of maximum likelihood
19. Efficiency
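As a concrete illustration of topics 15-18: for the Poisson distribution, the maximum-likelihood estimate of the rate is simply the sample mean, and the method of moments gives the same answer, since E[X] equals the rate. A minimal sketch (the count data here are hypothetical, invented for illustration):

```python
# Maximum-likelihood estimation for the Poisson distribution.
# For X_1..X_n ~ Poisson(lam), the log-likelihood is
#   l(lam) = sum(x_i)*log(lam) - n*lam - sum(log(x_i!)),
# which is maximized at lam_hat = mean(x) -- the same estimate
# the method of moments gives, since E[X] = lam.

import math

def poisson_mle(counts):
    """MLE of the Poisson rate: the sample mean."""
    return sum(counts) / len(counts)

def poisson_loglik(lam, counts):
    """Log-likelihood of the data under Poisson(lam)."""
    return sum(x * math.log(lam) - lam - math.lgamma(x + 1) for x in counts)

# Hypothetical count data (e.g. organisms per water sample).
counts = [2, 0, 3, 1, 4, 2, 1, 3]
lam_hat = poisson_mle(counts)  # sample mean: 2.0 for these data

# The MLE beats nearby candidate rates on the log-likelihood:
assert poisson_loglik(lam_hat, counts) > poisson_loglik(lam_hat - 0.5, counts)
assert poisson_loglik(lam_hat, counts) > poisson_loglik(lam_hat + 0.5, counts)
```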

Lecture 4. - Statistical Hypotheses.

20. Goodness of fit
21. The Neyman-Pearson paradigm
22. Confidence intervals and hypothesis tests
23. Likelihood ratio tests
24. Examples

Lecture 5. - The analysis of variance

25. The one-way layout
26. Normal theory and F test
27. The two-way layout
28. Normal theory for the two-way layout
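The one-way layout above reduces to a single computation: partition the total variation into between-group and within-group sums of squares and form their ratio of mean squares. A minimal sketch with hypothetical measurements under three treatments:

```python
# One-way ANOVA: decompose variation into between-group (SSB) and
# within-group (SSW) sums of squares and form the F statistic
#   F = (SSB / (k - 1)) / (SSW / (N - k)),
# where k is the number of groups and N the total sample size.

def one_way_anova_F(groups):
    k = len(groups)                      # number of treatment groups
    N = sum(len(g) for g in groups)      # total sample size
    grand_mean = sum(sum(g) for g in groups) / N
    # Between-group sum of squares (weighted by group size).
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (deviations from each group mean).
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (N - k))

# Hypothetical responses under three treatments.
groups = [[4.0, 5.0, 6.0], [7.0, 8.0, 9.0], [4.0, 6.0, 8.0]]
F = one_way_anova_F(groups)  # compare with F(k-1, N-k) critical values
```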

Lecture 6. - Simple linear regression

29. The simple linear regression model
30. Least squares
31. Properties of the least squares estimators
32. Inferences concerning the regression coefficients
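Topics 29-31 come down to two closed-form estimates. For the model y = b0 + b1*x + e, least squares gives b1_hat = Sxy/Sxx and b0_hat = ybar - b1_hat*xbar. A minimal sketch on made-up data lying exactly on a line, so the fit recovers the true coefficients:

```python
# Ordinary least squares for the simple linear model y = b0 + b1*x + e.
# Closed-form estimates:
#   b1_hat = Sxy / Sxx,    b0_hat = ybar - b1_hat * xbar.

def least_squares(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    b1 = sxy / sxx
    b0 = ybar - b1 * xbar
    return b0, b1

# Illustrative data lying exactly on y = 1 + 2x.
x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 7.0]
b0, b1 = least_squares(x, y)  # recovers intercept 1 and slope 2
```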

Lecture 7. - Multiple linear regression

33. General form of a multiple regression model
34. Model assumptions
35. Inferences in multiple linear regression

Lecture 8. - Nonlinear regression

36. Transformation to a linear model
37. Non-linear least squares
38. Numerical methods
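Topic 36 can be made concrete with the exponential model: taking logs of y = a*exp(b*x) gives log(y) = log(a) + b*x, a linear model fit by ordinary least squares on (x, log y). A minimal sketch on hypothetical noise-free data:

```python
import math

# Transformation to a linear model: the exponential model
#   y = a * exp(b * x)
# becomes log(y) = log(a) + b*x after taking logs, so ordinary
# least squares on (x, log y) recovers b and log(a).

def fit_exponential(x, y):
    z = [math.log(yi) for yi in y]        # transformed response
    n = len(x)
    xbar = sum(x) / n
    zbar = sum(z) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxz = sum((xi - xbar) * (zi - zbar) for xi, zi in zip(x, z))
    b = sxz / sxx                         # slope of the linear fit
    a = math.exp(zbar - b * xbar)         # intercept is log(a)
    return a, b

# Noise-free data generated from y = 2 * exp(0.5 * x).
x = [0.0, 1.0, 2.0, 3.0]
y = [2.0 * math.exp(0.5 * xi) for xi in x]
a, b = fit_exponential(x, y)  # recovers a = 2, b = 0.5
```

Note that least squares on the log scale is not the same as nonlinear least squares on the original scale; with noisy data the two can give different answers, which is where the numerical methods of topic 38 come in.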

Lecture 9. - Nonlinear regression models

39. Exponential regression models
40. Michaelis-Menten model
41. Monod model
42. Sigmoidal Models
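The Monod model of topic 41 relates growth rate to substrate concentration as mu(S) = mu_max * S / (Ks + S). A minimal sketch of fitting it by nonlinear least squares, here via a crude grid search on hypothetical noise-free data (real work would use a proper optimizer such as Gauss-Newton or Levenberg-Marquardt):

```python
# The Monod model: mu(S) = mu_max * S / (Ks + S).
# Crude nonlinear least-squares fit by grid search over (mu_max, Ks).

def monod(S, mu_max, Ks):
    return mu_max * S / (Ks + S)

def fit_monod_grid(S, mu):
    best = None
    for mu_max in [m / 10 for m in range(1, 31)]:   # candidates 0.1 .. 3.0
        for Ks in [k / 10 for k in range(1, 31)]:   # candidates 0.1 .. 3.0
            sse = sum((m - monod(s, mu_max, Ks)) ** 2
                      for s, m in zip(S, mu))
            if best is None or sse < best[0]:
                best = (sse, mu_max, Ks)
    return best[1], best[2]

# Hypothetical noise-free data from mu_max = 1.2, Ks = 0.5.
S = [0.1, 0.25, 0.5, 1.0, 2.0, 5.0]
mu = [monod(s, 1.2, 0.5) for s in S]
mu_max_hat, Ks_hat = fit_monod_grid(S, mu)  # recovers 1.2 and 0.5
```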

Lecture 10. - Dose response models

43. Logistic model
44. Dose finding toxicity model
45. Applications and examples

Lecture 11. - Optimal experimental designs

46. Linear regression models
47. Optimality criteria
48. Nonlinear regression models
49. Local and minimax optimal designs
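A standard first example of an optimality criterion: for simple linear regression on [-1, 1], the D-optimal design (the one maximizing the determinant of the information matrix X'X) puts half the runs at each endpoint. A minimal sketch comparing the endpoint design with an equally spaced one:

```python
# D-optimality for y = b0 + b1*x on the design space [-1, 1]:
# maximize det(X'X). The endpoint design beats an equally
# spaced design of the same size.

def det_information(xs):
    """det(X'X) for the model y = b0 + b1*x at design points xs."""
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    # X'X = [[n, sx], [sx, sxx]], a 2x2 matrix.
    return n * sxx - sx * sx

endpoint = [-1.0, -1.0, 1.0, 1.0]          # two runs at each endpoint
spaced = [-1.0, -1.0 / 3, 1.0 / 3, 1.0]    # four equally spaced runs

better = det_information(endpoint) > det_information(spaced)  # True
```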

Lecture 12. - Bayesian statistics

50. Bayes formula
51. Posterior analysis
52. Markov Chain Monte Carlo (MCMC) methods
53. The Metropolis-Hastings algorithm and Gibbs sampling
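Topics 52-53 in miniature: a random-walk Metropolis-Hastings sampler needs the target density only up to a normalizing constant. A minimal sketch targeting a standard normal (the step size, chain length, and seed are arbitrary illustration choices):

```python
import math, random

# Random-walk Metropolis-Hastings targeting a standard normal.
# Only the log of the unnormalized target density is needed.

def metropolis_hastings(log_target, start, n_samples, step=1.0, seed=0):
    rng = random.Random(seed)
    x = start
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)   # symmetric proposal
        # Accept with probability min(1, target(proposal) / target(x)).
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

log_std_normal = lambda x: -0.5 * x * x       # N(0,1) up to a constant

draws = metropolis_hastings(log_std_normal, start=0.0, n_samples=20000)
mean = sum(draws) / len(draws)                # close to 0
var = sum((d - mean) ** 2 for d in draws) / len(draws)  # close to 1
```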

Lecture 13. - Applications of Bayesian methods

54. Introduction to OpenBUGS
55. Bayesian inferences for the normal distribution
56. Hierarchical models
57. Examples
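Topic 55 has a closed-form answer in the conjugate case: with prior mu ~ N(m0, s0^2) and data x_i ~ N(mu, sigma^2) with sigma^2 known, the posterior for mu is normal with precision-weighted mean. A minimal sketch on hypothetical data (OpenBUGS would fit the same model by simulation):

```python
# Conjugate Bayesian inference for a normal mean, known variance:
#   prior:      mu ~ N(m0, s0^2)
#   likelihood: x_i ~ N(mu, sigma^2), i = 1..n
# Posterior is normal with
#   precision_n = 1/s0^2 + n/sigma^2
#   mean_n = (m0/s0^2 + sum(x)/sigma^2) / precision_n

def normal_posterior(data, sigma2, m0, s02):
    n = len(data)
    precision_n = 1.0 / s02 + n / sigma2
    mean_n = (m0 / s02 + sum(data) / sigma2) / precision_n
    return mean_n, 1.0 / precision_n   # posterior mean and variance

# Hypothetical data with known observation variance sigma^2 = 4
# and a vague prior centered at 0.
data = [9.0, 11.0, 10.0, 12.0]
post_mean, post_var = normal_posterior(data, sigma2=4.0, m0=0.0, s02=100.0)
# The posterior mean sits between the prior mean (0) and the
# sample mean (10.5), pulled strongly toward the data.
```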

Lecture 14. - Bayesian regression

58. Bayesian linear regression
59. Hierarchical linear models
60. Generalized linear models
61. Nonlinear models

Final exam.
