Regression Analysis Solutions

Solutions Manual to accompany Applied Linear Statistical Models, Fifth Edition

Michael H. Kutner, Emory University
Christopher J. Nachtsheim, University of Minnesota
John Neter, University of Georgia
William Li, University of Minnesota

2005, McGraw-Hill/Irwin, Chicago, IL / Boston, MA

PREFACE

This Solutions Manual gives intermediate and final numerical results for all end-of-chapter Problems, Exercises, and Projects with computational elements contained in Applied Linear Statistical Models, 5th edition. This Solutions Manual also contains proofs for all Exercises that require derivations.

No solutions are provided for the Case Studies. In presenting calculational results we frequently show, for ease in checking, more digits than are significant for the original data. Students and other users may obtain slightly different answers than those presented here, because of different rounding procedures. When a problem requires a percentile (e.g., of the t or F distributions) not included in the Appendix B Tables, users may either interpolate in the table or employ an available computer program for finding the needed value. Again, slightly different values may be obtained than the ones shown here. We have included many more Problems, Exercises, and Projects at the ends of chapters than can be used in a term, in order to provide choice and flexibility to instructors in assigning problem material. For all major topics, three or more problem settings are presented, and the instructor can select different ones from term to term. Another option is to supply students with a computer printout for one of the problem settings for study and class discussion and to select one or more of the other problem settings for individual computation and solution.


By drawing on the basic numerical results in this Manual, the instructor also can easily design additional questions to supplement those given in the text for a given problem setting. The data sets for all Problems, Exercises, Projects and Case Studies are contained in the compact disk provided with the text to facilitate data entry. It is expected that the student will use a computer or have access to computer output for all but the simplest data sets, where use of a basic calculator would be adequate.

For most students, hands-on experience in obtaining the computations by computer will be an important part of the educational experience in the course. While we have checked the solutions very carefully, it is possible that some errors are still present. We would be most grateful to have any errors called to our attention. Errata can be reported via the website for the book: http://www.mhhe.com/KutnerALSM5e. We acknowledge with thanks the assistance of Lexin Li and Yingwen Dong in the checking of Chapters 1-14 of this manual.

We, of course, are responsible for any errors or omissions that remain.

Michael H. Kutner
Christopher J. Nachtsheim
John Neter
William Li

Contents

1 LINEAR REGRESSION WITH ONE PREDICTOR VARIABLE
2 INFERENCES IN REGRESSION AND CORRELATION ANALYSIS
3 DIAGNOSTICS AND REMEDIAL MEASURES
4 SIMULTANEOUS INFERENCES AND OTHER TOPICS IN REGRESSION ANALYSIS
5 MATRIX APPROACH TO SIMPLE LINEAR REGRESSION ANALYSIS
6 MULTIPLE REGRESSION - I
7 MULTIPLE REGRESSION - II
8 MODELS FOR QUANTITATIVE AND QUALITATIVE PREDICTORS
9 BUILDING THE REGRESSION MODEL I: MODEL SELECTION AND VALIDATION
10 BUILDING THE REGRESSION MODEL II: DIAGNOSTICS
11 BUILDING THE REGRESSION MODEL III: REMEDIAL MEASURES
12 AUTOCORRELATION IN TIME SERIES DATA
13 INTRODUCTION TO NONLINEAR REGRESSION AND NEURAL NETWORKS
14 LOGISTIC REGRESSION, POISSON REGRESSION, AND GENERALIZED LINEAR MODELS
15 INTRODUCTION TO THE DESIGN OF EXPERIMENTAL AND OBSERVATIONAL STUDIES
16 SINGLE-FACTOR STUDIES
17 ANALYSIS OF FACTOR LEVEL MEANS
18 ANOVA DIAGNOSTICS AND REMEDIAL MEASURES
19 TWO-FACTOR ANALYSIS OF VARIANCE WITH EQUAL SAMPLE SIZES
20 TWO-FACTOR STUDIES - ONE CASE PER TREATMENT
21 RANDOMIZED COMPLETE BLOCK DESIGNS
22 ANALYSIS OF COVARIANCE
23 TWO-FACTOR STUDIES WITH UNEQUAL SAMPLE SIZES
24 MULTIFACTOR STUDIES
25 RANDOM AND MIXED EFFECTS MODELS
26 NESTED DESIGNS, SUBSAMPLING, AND PARTIALLY NESTED DESIGNS
27 REPEATED MEASURES AND RELATED DESIGNS
28 BALANCED INCOMPLETE BLOCK, LATIN SQUARE, AND RELATED DESIGNS
29 EXPLORATORY EXPERIMENTS - TWO-LEVEL FACTORIAL AND FRACTIONAL FACTORIAL DESIGNS
30 RESPONSE SURFACE METHODOLOGY

Appendix D: RULES FOR DEVELOPING ANOVA MODELS AND TABLES FOR BALANCED DESIGNS

Chapter 1
LINEAR REGRESSION WITH ONE PREDICTOR VARIABLE

1.1. No
1.2. Y = 300 + 2X, functional
1.5. No
1.7. a. No  b. Yes, .68
1.8. Yes, no
1.10. No
1.12. a. Observational
1.13. a. Observational
1.18. No
1.19. a. b0 = 2.11405, b1 = 0.03883, Ŷ = 2.11405 + .03883X
c. Ŷh = 3.27895
d. b1 = 0.03883
1.20. a. Ŷ = −0.5802 + 15.0352X
d. Ŷh = 74.5958
1.21. a. Ŷ = 10.20 + 4.00X
b. Ŷh = 14.2
c. 4.0
d. (X̄, Ȳ) = (1, 14.2)
1.22. a. Ŷ = 168.600000 + 2.034375X
b. Ŷh = 249.975
c. b1 = 2.034375
1.23. a. i: 1, 2, ..., 119, 120; ei: 0.9676, 1.2274, ..., −0.8753, −0.2532. Yes
b. MSE = 0.388, √MSE = 0.623, grade points
1.24. a. i: 1, 2, ..., 44, 45; ei: −9.4903, 0.4392, ..., 1.4392, 2.4039; Σei² = 3416.377; min Q = Σei²
b. MSE = 79.45063, √MSE = 8.913508, minutes
1.25. a. e1 = 1.8000
b. Σei² = 17.6000, MSE = 2.2000, σ²
1.26. a.
i:   1       2      3       4       5      6      7       8
ei:  −2.150  3.850  −5.150  −1.150  .575   2.575  −2.425  5.575
i:   9       10     11      12      13     14      15     16
ei:  3.300   .300   1.300   −3.700  .025   −1.975  3.025  −3.975
Yes
b. MSE = 10.459, √MSE = 3.234, Brinell units
1.27. a. Ŷ = 156.35 − 1.19X
b. (1) b1 = −1.19, (2) Ŷh = 84.95, (3) e8 = 4.4433, (4) MSE = 66.8
1.28. a. Ŷ = 20517.6 − 170.575X
b. (1) b1 = −170.575, (2) Ŷh = 6871.6, (3) e10 = 1401.566, (4) MSE = 5552112
1.31. No, no
1.32. Solving (1.9a) and (1.9b) for b0 and equating the results:
(ΣYi − b1ΣXi)/n = (ΣXiYi − b1ΣXi²)/ΣXi
and then solving for b1 yields:
b1 = [nΣXiYi − ΣXiΣYi]/[nΣXi² − (ΣXi)²] = [ΣXiYi − ΣXiΣYi/n]/[ΣXi² − (ΣXi)²/n]
1.33. Q = Σ(Yi − β0)², dQ/dβ0 = −2Σ(Yi − β0). Setting the derivative equal to zero, simplifying, and substituting the least squares estimator b0 yields:
Σ(Yi − b0) = 0, or b0 = Ȳ
1.34. E{b0} = E{Ȳ} = (1/n)ΣE{Yi} = (1/n)(nβ0) = β0
1.35. From the first normal equation (1.9a): ΣYi = nb0 + b1ΣXi = Σ(b0 + b1Xi) = ΣŶi from (1.13)
1.36. ΣŶiei = Σ(b0 + b1Xi)ei = b0Σei + b1ΣXiei = 0, because Σei = 0 from (1.17) and ΣXiei = 0 from (1.19)
1.38. (1) 76, yes; (2) 60, yes
1.39. a. Applying (1.10a) and (1.10b) to (5, Ȳ1), (10, Ȳ2), and (15, Ȳ3), we obtain:
b1 = (Ȳ3 − Ȳ1)/10, b0 = (4Ȳ1 + Ȳ2 − 2Ȳ3)/3
b. Using (1.10a) and (1.10b) with the six original points yields the same results.
1.40. No
1.41. a. Q = Σ(Yi − β1Xi)², dQ/dβ1 = −2Σ(Yi − β1Xi)Xi. Setting the derivative equal to zero, simplifying, and substituting the least squares estimator b1 yields:
b1 = ΣYiXi/ΣXi²
b. L = Π(i=1 to n) (2πσ²)^(−1/2) exp[−(1/(2σ²))(Yi − β1Xi)²]. Yes.
It is more convenient to work with loge L:
loge L = −(n/2) loge(2πσ²) − (1/(2σ²))Σ(Yi − β1Xi)²
d loge L/dβ1 = (1/σ²)Σ(Yi − β1Xi)Xi
c. Setting the derivative equal to zero, simplifying, and substituting the maximum likelihood estimator b1 yields:
Σ(Yi − b1Xi)Xi = 0, or b1 = ΣYiXi/ΣXi². Yes.
d. E{b1} = E{ΣYiXi/ΣXi²} = (1/ΣXi²)ΣXiE{Yi} = (1/ΣXi²)ΣXi(β1Xi) = β1
1.42. a. L(β1) = Π(i=1 to 6) (32π)^(−1/2) exp[−(1/32)(Yi − β1Xi)²]
b. L(17) = 9.45 × 10⁻³⁰, L(18) = 2.65 × 10⁻⁷, L(19) = 3.05 × 10⁻³⁷; β1 = 18
c. b1 = 17.928, yes
d. Yes
1.43. b. Total population: Ŷ = −10.635 + 0.0027954X
Number of hospital beds: Ŷ = −95.9322 + 0.743116X
Total personal income: Ŷ = −48.3948 + .131701X
c. Total population: MSE = 372,203.5
Number of hospital beds: MSE = 310,191.9
Total personal income: MSE = 324,539.4
1.44. a. Region 1: Ŷ = −1723.0 + 480.0X
Region 2: Ŷ = 916.4 + 299.3X
Region 3: Ŷ = 401.56 + 272.22X
Region 4: Ŷ = 396.1 + 508.0X
c. Region 1: MSE = 64,444,465
Region 2: MSE = 141,479,673
Region 3: MSE = 50,242,464
Region 4: MSE = 514,289,367
1.45. a. Infection risk: Ŷ = 6.3368 + .7604X
Facilities: Ŷ = 7.7188 + .0447X
X-ray: Ŷ = 6.664 + .0378X
c. Infection risk: MSE = 2.638
Facilities: MSE = 3.221
X-ray: MSE = 3.147
1.46. a. Region 1: Ŷ = 4.5379 + 1.3478X
Region 2: Ŷ = 7.5605 + .4832X
Region 3: Ŷ = 7.1293 + .5251X
Region 4: Ŷ = 8.0381 + .0173X
c. Region 1: MSE = 4.353
Region 2: MSE = 1.038
Region 3: MSE = .940
Region 4: MSE = 1.078
1.47. a. L(β0, β1) = Π(i=1 to 6) (32π)^(−1/2) exp[−(1/32)(Yi − β0 − β1Xi)²]
b. b0 = 1.5969, b1 = 17.8524
c. Yes

Chapter 2
INFERENCES IN REGRESSION AND CORRELATION ANALYSIS

2.1. Yes, α = .05
2.2. No
2.4. a. t(.995; 118) = 2.61814, .03883 ± 2.61814(.01277), .00540 ≤ β1 ≤ .07226
b. H0: β1 = 0, Ha: β1 ≠ 0. t* = (.03883 − 0)/.01277 = 3.04072. If |t*| ≤ 2.61814, conclude H0, otherwise Ha. Conclude Ha.
c. 0.00291
2.5. a. t(.95; 43) = 1.6811, 15.0352 ± 1.6811(.4831), 14.2231 ≤ β1 ≤ 15.8473
b. H0: β1 = 0, Ha: β1 ≠ 0. t* = (15.0352 − 0)/.4831 = 31.122. If |t*| ≤ 1.681 conclude H0, otherwise Ha. Conclude Ha. P-value = 0+
c. Yes
d. H0: β1 ≤ 14, Ha: β1 > 14. t* = (15.0352 − 14)/.4831 = 2.1428. If t* ≤ 1.681 conclude H0, otherwise Ha. Conclude Ha. P-value = .0189
2.6. a. t(.975; 8) = 2.306, b1 = 4.0, s{b1} = .469, 4.0 ± 2.306(.469), 2.918 ≤ β1 ≤ 5.082
b. H0: β1 = 0, Ha: β1 ≠ 0. t* = (4.0 − 0)/.469 = 8.529. If |t*| ≤ 2.306 conclude H0, otherwise Ha. Conclude Ha. P-value = .00003
c. b0 = 10.20, s{b0} = .663, 10.20 ± 2.306(.663), 8.671 ≤ β0 ≤ 11.729
d. H0: β0 ≤ 9, Ha: β0 > 9. t* = (10.20 − 9)/.663 = 1.810. If t* ≤ 2.306 conclude H0, otherwise Ha. Conclude H0. P-value = .053
e. H0: β1 = 0: δ = |2 − 0|/.5 = 4, power = .93; H0: β0 ≤ 9: δ = |11 − 9|/.75 = 2.67, power = .78
2.7. a. t(.995; 14) = 2.977, b1 = 2.0344, s{b1} = .0904, 2.0344 ± 2.977(.0904), 1.765 ≤ β1 ≤ 2.304
b. H0: β1 = 2, Ha: β1 ≠ 2. t* = (2.0344 − 2)/.0904 = .381. If |t*| ≤ 2.977 conclude H0, otherwise Ha. Conclude H0. P-value = .71
c. δ = |.3|/.1 = 3, power = .50
2.8. a. H0: β1 = 3.0, Ha: β1 ≠ 3.0. t* = (3.57 − 3.0)/.3470 = 1.643, t(.975; 23) = 2.069. If |t*| ≤ 2.069 conclude H0, otherwise Ha. Conclude H0.
b. δ = |.5|/.35 = 1.43, power = .30 (by linear interpolation)
2.10. a. Prediction
b. Mean response
c. Prediction
2.12. No, no
2.13. a. Ŷh = 3.2012, s{Ŷh} = .0706, t(.975; 118) = 1.9803, 3.2012 ± 1.9803(.0706), 3.0614 ≤ E{Yh} ≤ 3.3410
b. s{pred} = .6271, 3.2012 ± 1.9803(.6271), 1.9594 ≤ Yh(new) ≤ 4.4430
c. Yes, yes
d. W² = 2F(.95; 2, 118) = 2(3.0731) = 6.1462, W = 2.4792, 3.2012 ± 2.4792(.0706), 3.0262 ≤ β0 + β1Xh ≤ 3.3762, yes, yes
2.14. a. Ŷh = 89.6313, s{Ŷh} = 1.3964, t(.95; 43) = 1.6811, 89.6313 ± 1.6811(1.3964), 87.2838 ≤ E{Yh} ≤ 91.9788
b. s{pred} = 9.0222, 89.6313 ± 1.6811(9.0222), 74.4641 ≤ Yh(new) ≤ 104.7985, yes, yes
c. 87.2838/6 = 14.5473, 91.9788/6 = 15.3298, 14.5473 ≤ mean time per machine ≤ 15.3298
d. W² = 2F(.90; 2, 43) = 2(2.4304) = 4.8608, W = 2.2047, 89.6313 ± 2.2047(1.3964), 86.5527 ≤ β0 + β1Xh ≤ 92.7099, yes, yes
2.15. a. Xh = 2: Ŷh = 18.2, s{Ŷh} = .663, t(.995; 8) = 3.355, 18.2 ± 3.355(.663), 15.976 ≤ E{Yh} ≤ 20.424
Xh = 4: Ŷh = 26.2, s{Ŷh} = 1.483, 26.2 ± 3.355(1.483), 21.225 ≤ E{Yh} ≤ 31.175
b. s{pred} = 1.625, 18.2 ± 3.355(1.625), 12.748 ≤ Yh(new) ≤ 23.652
c. s{predmean} = 1.083, 18.2 ± 3.355(1.083), 14.567 ≤ Ȳh(new) ≤ 21.833; 44 = 3(14.567) ≤ total number of broken ampules ≤ 3(21.833) = 65
d. W² = 2F(.99; 2, 8) = 2(8.649) = 17.298, W = 4.159
Xh = 2: 18.2 ± 4.159(.663), 15.443 ≤ β0 + β1Xh ≤ 20.957
Xh = 4: 26.2 ± 4.159(1.483), 20.032 ≤ β0 + β1Xh ≤ 32.368
yes, yes
2.16. a. Ŷh = 229.631, s{Ŷh} = .8285, t(.99; 14) = 2.624, 229.631 ± 2.624(.8285), 227.457 ≤ E{Yh} ≤ 231.805
b. s{pred} = 3.338, 229.631 ± 2.624(3.338), 220.872 ≤ Yh(new) ≤ 238.390
c. s{predmean} = 1.316, 229.631 ± 2.624(1.316), 226.178 ≤ Ȳh(new) ≤ 233.084; yes, yes
d. W² = 2F(.98; 2, 14) = 2(5.241) = 10.482, W = 3.238, 229.631 ± 3.238(.8285), 226.948 ≤ β0 + β1Xh ≤ 232.314, yes, yes
2.17. Greater, H0: β1 = 0
2.20. No
2.21. No
2.22. Yes, yes
2.23. a.
Source      SS        df   MS
Regression  3.58785   1    3.58785
Error       45.8176   118  0.388285
Total       49.40545  119
b. E{MSR} = σ² + β1²Σ(Xi − X̄)², E{MSE} = σ²; both equal σ² when β1 = 0
c. H0: β1 = 0, Ha: β1 ≠ 0.
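Answers such as 1.21 and 2.6 above quote a least squares fit Ŷ = 10.20 + 4.00X with MSE = 2.2000, s{b1} = .469, and the interval 2.918 ≤ β1 ≤ 5.082. As a check on that arithmetic, the sketch below fits the model from the defining formulas. The data values are illustrative (chosen so the fit reproduces exactly those quoted answers; they are not copied from the text's data disk), and the percentile t(.975; 8) = 2.306 is taken from the answer above rather than computed.

```python
import math

# Illustrative airfreight-style data (X = number of transfers, Y = broken
# ampules), chosen to reproduce b1 = 4.0, b0 = 10.20, MSE = 2.2, s{b1} = .469.
X = [1, 0, 2, 0, 3, 1, 0, 1, 2, 0]
Y = [16, 9, 17, 12, 22, 13, 8, 15, 19, 11]
n = len(X)

xbar = sum(X) / n
ybar = sum(Y) / n
Sxx = sum((x - xbar) ** 2 for x in X)
Sxy = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y))

b1 = Sxy / Sxx                      # least squares slope, (1.10a)
b0 = ybar - b1 * xbar               # least squares intercept, (1.10b)

resid = [y - (b0 + b1 * x) for x, y in zip(X, Y)]
SSE = sum(e ** 2 for e in resid)
MSE = SSE / (n - 2)                 # unbiased estimator of sigma^2

s_b1 = math.sqrt(MSE / Sxx)         # estimated standard error of b1
t = 2.306                           # t(.975; 8), tabled value from the answer
ci = (b1 - t * s_b1, b1 + t * s_b1)

print(b1, b0, MSE, round(s_b1, 3), [round(v, 3) for v in ci])
```

The printed interval matches the 2.6a answer to three decimals.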

F* = 3.58785/0.388285 = 9.24, F(.99; 1, 118) = 6.855. If F* ≤ 6.855 conclude H0, otherwise Ha. Conclude Ha.
d. SSR = 3.58785
e. 7.26% or 0.0726, coefficient of determination
f. r = +0.2695
2.24. a.
Source               SS         df  MS
Regression           76,960.4   1   76,960.4
Error                3,416.38   43  79.4506
Total                80,376.78  44
Correction for mean  261,747.2  1
Total, uncorrected   342,124    45
b. H0: β1 = 0, Ha: β1 ≠ 0. F* = 76,960.4/79.4506 = 968.66, F(.90; 1, 43) = 2.826. If F* ≤ 2.826 conclude H0, otherwise Ha. Conclude Ha.
c. 95.75% or 0.9575, coefficient of determination
d. r = +.9785
2.26. a.
Source      SS          df  MS
Regression  5,297.5125  1   5,297.5125
Error       146.4250    14  10.4589
Total       5,443.9375  15
c. (cases 1-12; cases 13-16 follow below)
i:        1         2         3         4         5        6
Yi − Ŷi:  −2.150    3.850     −5.150    −1.150    .575     2.575
Ŷi − Ȳ:   −24.4125  −24.4125  −24.4125  −24.4125  −8.1375  −8.1375
i:        7         8         9         10        11       12
Yi − Ŷi:  −2.425    5.575     3.300     .300      1.300    −3.700
Ŷi − Ȳ:   −8.1375   −8.1375   8.1375    8.1375    8.1375   8.1375
2.25. a.
Source      SS      df  MS
Regression  160.00  1   160.00
Error       17.60   8   2.20
Total       177.60  9
b. H0: β1 = 0, Ha: β1 ≠ 0. F* = 160.00/2.20 = 72.727, F(.95; 1, 8) = 5.32.

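The ANOVA quantities quoted around here for the ampule-style problem (SSR = 160.00, SSE = 17.60, F* = 72.727, R² = .9009) and the identity (t*)² = F* can be reproduced numerically. The data below are the same illustrative values used in the earlier sketch, restated so this block is self-contained; they are assumptions chosen to match those answers, not the text's own data file.

```python
import math

# Illustrative data chosen to reproduce the quoted ANOVA answers
# (SSR = 160.00, SSE = 17.60, F* = 72.727, R^2 = .9009).
X = [1, 0, 2, 0, 3, 1, 0, 1, 2, 0]
Y = [16, 9, 17, 12, 22, 13, 8, 15, 19, 11]
n = len(X)

xbar, ybar = sum(X) / n, sum(Y) / n
Sxx = sum((x - xbar) ** 2 for x in X)
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(X, Y)) / Sxx
b0 = ybar - b1 * xbar

SSTO = sum((y - ybar) ** 2 for y in Y)
SSE = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(X, Y))
SSR = SSTO - SSE                      # decomposition SSTO = SSR + SSE

MSR, MSE = SSR / 1, SSE / (n - 2)
Fstar = MSR / MSE                     # F test of H0: beta1 = 0
R2 = SSR / SSTO                       # coefficient of determination
tstar = b1 / math.sqrt(MSE / Sxx)     # t test of the same hypothesis

print(round(Fstar, 3), round(R2, 4), round(tstar ** 2, 3))
```

In simple linear regression the two tests agree: (t*)² equals F* up to rounding, which is the equivalence noted in the answer key.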
If F ? ? 5. 32 conclude H0 , otherwise Ha . Conclude Ha . t? = (4. 00 ? 0)/. 469 = 8. 529, (t? )2 = (8. 529)2 = 72. 7 = F ? R2 = . 9009, r = . 9492, 90. 09% H0 : ? 1 = 0, Ha : ? 1 = 0, F ? = 5, 297. 5125/10. 4589 = 506. 51, F (. 99; 1, 14) = 8. 86. If F ? ? 8. 86 conclude H0 , otherwise Ha . Conclude Ha . i: 13 14 ? Yi ? Yi : . 025 -1. 975 ? ? Yi ? Y : 24. 4125 24. 4125 d. 2. 27. a. R2 = . 9731, r = . 9865 16 -3. 975 24. 4125 H0 : ? 1 ? 0, Ha : ? 1 < 0. s{b1 } = 0. 090197, t? = (? 1. 19 ? 0)/. 090197 = ? 13. 193, t(. 05; 58) = ? 1. 67155. If t? ? ? 1. 67155 conclude H0 , otherwise Ha . Conclude Ha . P -value= 0+ c. 2. 28. a. b. . 2. 29. a. t(. 975; 58) = 2. 00172, ? 1. 19 ± 2. 00172(. 090197), ? 1. 3705 ? ?1 ? ?1. 0095 ? ? Yh = 84. 9468, s{Yh } = 1. 05515, t(. 975; 58) = 2. 00172, 84. 9468 ± 2. 00172(1. 05515), 82. 835 ? E{Yh } ? 87. 059 s{Yh(new) } = 8. 24101, 84. 9468 ± 2. 00172(8. 24101), 68. 451 ? Yh(new) ? 101. 443 W 2 = 2F (. 95; 2, 58) = 2(3. 15593) = 6. 31186, W = 2. 512342, 84. 9468 ± 2. 512342(1. 05515), 82. 296 ? ?0 + ? 1 Xh ? 87. 598, yes, yes 2-4 i: 1 ? i : 0. 823243 Yi ? Y ? ? Yi ? Y : 20. 2101 b. 2 … -1. 55675 . . . 22. 5901 . . . 59 -0. 666887 -14. 2998 60 8. 09309 -19. 0598 Source SS Regression 11,627. 5 Error 3,874. 45 Total 15,501. 5 c. df MS 1 11,627. 5 58 66. 8008 59 H0 : ? 1 = 0, Ha : ? 1 = 0. F ? = 11, 627. 5/66. 8008 = 174. 0623, F (. 90; 1, 58) = 2. 79409. If F ? ? 2. 79409 conclude H0 , otherwise Ha . Conclude Ha . 24. 993% or . 24993 R2 = 0. 750067, r = ? 0. 866064 H0 : ? 1 = 0, Ha : ? 1 = 0. s{b1 } = 41. 5743, t? = (? 170. 575 ? 0)/41. 5743 = ? 4. 1029, t(. 995; 82) = 2. 63712. If |t? | ? 2. 63712 conclude H0 , otherwise Ha . Conclude Ha . P -value = 0. 000096 d. e. 2. 30. a. b. 2. 31. a. ?170. 575 ± 2. 63712(41. 5743), ? 280. 2114 ? ?1 ? ?60. 9386 Source SS Regression 93,462,942 Error 455,273,165 Total 548,736,107 df MS 1 93,462,942 82 5,552,112 83 . H0 : ? 1 = 0, Ha : ? 1 = 0. F ? = 93, 462, 942/5, 552, 112 = 16. 8338, F (. 
99; 1, 82) = 6. 9544. If F ? ? 6. 9544 conclude H0 , otherwise Ha. Conclude Ha . (t? )2 = (? 4. 102895)2 = 16. 8338 = F ? . [t(. 995; 82)]2 = (2. 63712)2 = 6. 9544 = F (. 99; 1, 82). Yes. SSR = 93, 462, 942, 17. 03% or 0. 1703 -0. 4127 Full: Yi = ? 0 + ? 1 Xi + ? i , reduced: Yi = ? 0 + ? i c. d. 2. 32. a. b. (1) SSE(F ) = 455, 273, 165, (2) SSE(R) = 548, 736, 107, (3) dfF = 82, (4) dfR = 83, (5) F ? = [(548, 736, 107 ? 455, 273, 165)/1] ? [455, 273, 165/82] = 16. 83376, (6) If F ? ? F (. 99; 1, 82) = 6. 95442 conclude H0 , otherwise Ha . . 2. 33. a. b. c. Yes H0 : ? 0 = 7. 5, Ha : ? 0 = 7. 5 Full: Yi = ? 0 + ? 1 Xi + ? i , reduced: Yi ? 7. 5 = ? 1 Xi + ? i Yes, dfR ? dfF = (n ? 1) ? (n ? 2) = 1 2-5 2. 36 Regression model 2. 38. No 2. 39. a. b. c. Normal, mean µ1 = 50, standard deviation ? 1 = 3 Normal, mean E{Y2 |Y1 = 55} = 105. 33, standard deviation ? 2|1 = 2. 40 Normal, mean E{Y1 |Y2 = 95} = 47, standard deviation ? 1|2 = 1. 80 2. 40. (1) No, (2) no, (3) yes 2. 41. No 2. 42. b. c. d. 2. 43. a. b. c. 2. 44. a. b. c. 2. 45. a. b. 2. 46. a. b. .95285, ? 12 v H0 : ? 12 = 0, Ha : ? 12 = 0. t? = (. 95285 13)/ 1 ? (. 95285)2 = 11. 32194, t(. 95; 13) = 3. 012. If |t? | ? 3. 012 conclude H0 , otherwise Ha . Conclude Ha . No v H0 : ? 12 = 0, Ha : ? 12 = 0. t? = (. 61 82)/ 1 ? (. 61)2 = 6. 9709, t(. 975; 82) = 1. 993. If |t? | ? 1. 993 conclude H0 , otherwise Ha . Conclude Ha . z = . 70892, ? {z } = . 1111, z(. 975) = 1. 960, . 70892 ± 1. 960(. 1111), . 49116 ? ? ? .92668, . 455 ? ?12 ? .729 . 207 ? ?2 ? .531 12 v H0 : ? 12 = 0, Ha : ? 12 = 0. t? = (. 87 101)/ 1 ? (. 87)2 = 17. 73321, t(. 95; 101) = 1. 663. If |t? | ? 1. 663 conclude H0 , otherwise Ha . Conclude Ha . z = 1. 33308, ? {z } = . 1, z(. 95) = 1. 645, 1. 33308 ± 1. 645(. 1), 1. 16858 ? ? ? 1. 9758, . 824 ? ?12 ? .905 . 679 ? ?2 ? .819 12 z = 1. 18814, ? {z } = . 0833, z(. 995) = 2. 576, 1. 18814 ± 2. 576(. 0833), . 97356 ? ? ? 1. 40272, . 750 ? ?12 ? .886. .563 ? ?2 ? .785 12 0. 
9454874 H0 : There is no association between Y1 and Y2 Ha : There is an association between Y1 and Y2 v 0. 9454874 13 ? = 10. 46803. t(0. 995, 13) = 3. 012276. If |t? | ? 3. 012276, t = 2 1 ? (0. 9454874) conclude H0 , otherwise, conclude Ha . Conclude Ha . -0. 866064, v H0 : ? 12 = 0, Ha : ? 12 = 0. t? = (? 0. 866064 58)/ 1 ? (? 0. 866064)2 = ? 13. 19326, t(. 975; 58) = 2. 00172. If |t? | ? 2. 00172 conclude H0 , otherwise Ha .

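The interval estimates of ρ12 in the 2.44-2.45 answers use the Fisher z transformation z' = (1/2) ln[(1 + r)/(1 − r)] with σ{z'} = 1/√(n − 3). A sketch reproducing the 2.44b limits .824 ≤ ρ12 ≤ .905: r = .87 is quoted in that answer, and n = 103 is implied by its √101 and σ{z'} = .1; the normal percentile is computed here rather than read from the table.

```python
import math
from statistics import NormalDist

# Fisher z interval for rho, reproducing the 2.44b answer above.
# r is quoted there; n = 103 is implied by sqrt(101) in that answer.
r, n = 0.87, 103
zp = math.atanh(r)                    # z' = (1/2) ln[(1 + r)/(1 - r)]
sd = 1 / math.sqrt(n - 3)             # sigma{z'} = .1 here
zcrit = NormalDist().inv_cdf(0.95)    # z(.95) ~ 1.645, for a 90% interval
lo = math.tanh(zp - zcrit * sd)       # transform limits back to rho scale
hi = math.tanh(zp + zcrit * sd)
print(round(lo, 3), round(hi, 3))
```

The back-transformed limits agree with the tabled-value computation in the answer to three decimals.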
Conclude Ha.
2.47. a. −0.8657217
b. H0: there is no association between X and Y; Ha: there is an association between X and Y.
t* = −0.8657217√58 / √(1 − (−0.8657217)²) = −13.17243. t(.975; 58) = 2.001717. If |t*| ≤ 2.001717 conclude H0, otherwise Ha. Conclude Ha.
2.48. a. −0.4127033
b. H0: ρ12 = 0, Ha: ρ12 ≠ 0. t* = (−0.4127033√82)/√(1 − (−0.4127033)²) = −4.102897, t(.995; 82) = 2.637123. If |t*| ≤ 2.637123 conclude H0, otherwise Ha. Conclude Ha.
2.49. a. −0.4259324
b. H0: there is no association between X and Y; Ha: there is an association between X and Y.
t* = −0.4259324√82 / √(1 − (−0.4259324)²) = −4.263013. t(.995; 82) = 2.637123. If |t*| ≤ 2.637123 conclude H0, otherwise Ha. Conclude Ha.
2.50. Σki Xi = Σ[(Xi − X̄)Xi]/Σ(Xi − X̄)² = Σ[(Xi − X̄)(Xi − X̄)]/Σ(Xi − X̄)² = 1, because Σ(Xi − X̄)X̄/Σ(Xi − X̄)² = 0
2.51. E{b0} = E{Ȳ − b1X̄} = (1/n)ΣE{Yi} − X̄E{b1} = (1/n)Σ(β0 + β1Xi) − X̄β1 = β0 + β1X̄ − X̄β1 = β0
2.52. σ²{b0} = σ²{Ȳ − b1X̄} = σ²{Ȳ} + X̄²σ²{b1} − 2X̄σ{Ȳ, b1} = σ²/n + X̄²σ²/Σ(Xi − X̄)² = σ²[1/n + X̄²/Σ(Xi − X̄)²], since σ{Ȳ, b1} = 0
2.53. a. L = Π(i=1 to n) g(Xi)·(2πσ²)^(−1/2) exp[−(1/(2σ²))(Yi − β0 − β1Xi)²]
Y )2 = [(b0 + b1 Xi ) ? Y ]2 ? ? b1 X) + b1 Xi ? Y ]2 ? ? = [(Y ? = b2 (Xi ? X)2 1 2. 56. a. b. 2. 57. a. b. E{M SR} = 1, 026. 36, E{M SE} = . 36 E{M SR} = 90. 36, E{M SE} = . 36 Yi ? 5Xi = ? 0 + ? i , n ? 1 Yi ? 2 ? 5Xi = ? i , n 2. 58. If ? 12 = 0, (2. 74) becomes: 1 1 f (Y1 , Y2 ) = exp ? 2?? 1 ? 2 2 =v 1 1 Y 1 ? µ1 exp ? 2 ? 1 2?? 1 2 Y 1 ? µ1 ? 1 ·v 2 Y2 ? µ 2 + ? 2 2 1 1 Y 2 ? µ2 exp ? 2 ? 2 2?? 2 2 = f1 (Y1 ) · f2 (Y2 ) n 2. 59. a. L= i=1 1 2?? 1 ? 2 1 ? ?2 12 ? exp{? Yi1 ? µ1 2 1 [( ) 2(1 ? ?2 ) ? 12 Yi1 ? µ1 Yi2 ? µ2 Yi2 ? µ2 2 )( )+( ) ]} ? 1 ? 2 ? 2 Maximum likelihood estimators can be found more easily by working with loge L: ? 2? 12 ( 2-8 loge L = ? n loge 2? ? n loge ? 1 ? n loge ? 2 ? n loge (1 ? ?2 ) 12 2 n Yi1 ? µ1 2 1 Yi1 ? µ1 Yi2 ? µ2 [( ? ) ? 2? 12 ( )( ) 2 2(1 ? ?12 ) i=1 ? 1 ? 1 ? 2 +( Yi2 ? µ2 2 )] ? 2 ? loge L 1 = 2 ? µ1 ? 1 (1 ? ?2 ) 12 ? loge L 1 = 2 ? µ2 ? 2 (1 ? ?2 ) 12 ?12 ? 1 ? 2 (1 ? ?2 ) 12 ? 12 (Yi2 ? µ2 ) ? ?1 ? 2 (1 ? ?2 ) 12 (Yi1 ? µ1 ) ? (Yi1 ? µ1 )2 ? ?12 3 ? 1 (Yi2 ? µ2 ) (Yi1 ? µ1 ) ? loge L n 1 =? + ?? 1 ? 1 (1 ? ?12 )2 n 1 ? loge L =? + ?? 2 ? 2 (1 ? ?12 )2 n? 12 1 ? loge L = + 2 ?? 12 1 ? 12 1 ? ?2 12 ? (Yi1 ? µ1 )(Yi2 ? µ2 ) 2 ? 1 ? 2 (Yi1 ? µ1 )(Yi2 ? µ2 ) (Yi2 ? µ2 )2 ? ?12 3 2 ? 2 ? 1 ? 2 Yi1 ? µ1 ? 12 Yi2 ? µ2 ? ?1 ? 2 (1 ? ?2 )2 12 Yi1 ? µ1 2 Yi2 ? µ2 2 Yi1 ? µ1 Yi2 ? µ2 ? 2? 12 + ? 1 ? 1 ? 2 ? 2 Setting the derivatives equal to zero, simplifying, and substituting the maximum likelihood estimators µ1 , µ2 , ? 1 , ? 2 , and ? 12 yields: ? ? ? ? ? 1 ? ? (1) (Yi1 ? µ1 ) ? 12 ? (Yi2 ? µ2 ) = 0 ? ?1 ? ?2 ? 1 ? ? (2) (Yi2 ? µ2 ) ? 12 ? (Yi1 ? µ1 ) = 0 ? ?2 ? ?1 ? (Yi1 ? µ1 )2 ? (Yi1 ? µ1 )(Yi2 ? µ2 ) ? ? (3) ? n(1 ? ?2 ) = 0 ? 12 ? ?12 ? 2 ? 1? 2 ? ? ? 1 ? (4) (5) (Yi2 ? µ2 )2 ? ? ? 12 ? 2 ? 2 ? (Yi1 ? µ1 )(Yi2 ? 2 ) ? ? ? n(1 ? ?2 ) = 0 ? 12 ? 1? 2 ? ? Yi1 ? µ1 ? ?1 ? 2 n? 12 (1 ? ?2 ) + (1 + ? 2 ) ? ?12 ? 12 ? ?? 12 ? Yi1 ? µ1 ? ? ? 1 ? 2 ? Yi2 ? µ2 ? ?2 ? Yi2 ? µ2 ? + ? 2 ? ?=0 Solving equations (1) and (2) yields: ? ? 
µ1 = Y 1 ? µ2 = Y 2 ? Using these results in equations (3), (4), and (5), it will be found that the maximum likelihood estimators are: ? (Yi1 ? Y1 )2 ? ? µ1 = Y 1 ? µ2 = Y 2 ? ?1 = ? n ? ? ? (Yi1 ? Y1 )(Yi2 ? Y2 ) (Yi2 ? Y2 )2 ? 12 = ? ?2 = ? 1 ? ? 1 n [ (Yi1 ? Y1 )2 ] 2 [ (Yi2 ? Y2 )2 ] 2 2-9 b. ?1|2 = µ1 ? µ2 ? 12 ? ? ? ? ? ? = Y1 ? Y2 ? ? = Y1 ? Y2 ? 1 ? ? ? 12 = ? 12 ? ?2 ? ?1 ? ?2 ? ? ? (Yi1 ? Y1 )(Yi2 ? Y2 ) ? ? ? 1 [ (Yi1 ? Y1 )2 ] 2 [ (Yi2 ? Y2 )2 ] 2 ? ? (Yi1 ? Y1 )(Yi2 ? Y2 ) ? (Yi2 ? Y2 )2 ? ? (Yi1 ? Y1 )2 /n ? (Yi2 ? Y2 )2 /n ? ? ? ? (Yi1 ? Y1 )(Yi2 ? Y2 ) ? = ? 1 )2 ] 1 [ (Yi2 ? Y2 )2 ] 1 ? 2 2 [ (Yi1 ? Y ? ? (Yi1 ? Y1 )(Yi2 ? Y2 ) = ? (Yi2 ? Y2 )2 ? ? (Yi1 ? Y1 )2 /n ? (Yi2 ? Y2 )2 /n ? ? c. ? 2 = ? 2 (1 ? ?2 ) ? 1|2 ? 1 ? 12 ? ? ? (Yi1 ? Y1 )2 [ (Yi1 ? Y1 )(Yi2 ? Y2 )]2 = 1? ? ? n (Yi1 ? Y1 )2 (Yi2 ? Y2 )2 ? ? ? (Yi1 ? Y1 )2 [ (Yi1 ? Y1 )(Yi2 ? Y2 )]2 = ? ? n n (Yi2 ? Y2 )2 The equivalence is shown by letting Yi1 and Yi2 in part (b) be Yi and Xi , respectively. 2. 60. Using regression notation and letting ? (Xi ? X)2 = (n ? )s2 X and ? (Yi ? Y )2 = (n ? 1)s2 , Y 1 2 we have from (2. 84) with Yi1 = Yi and Yi2 = Xi sY b1 = r12 since b1 = sX SSE = ? (Yi ? Y )2 ? (Xi ? X)2 r12 ? ? [ (Xi ? X)(Yi ? Y )]2 ? (Yi ? Y )2 ? ? (Xi ? X)2 2 2 = (n ? 1)s2 ? r12 (n ? 1)s2 = (n ? 1)s2 (1 ? r12 ) Y Y Y s2 {b1 } = Hence: 2 2 s2 (1 ? r12 ) (n ? 1)s2 (1 ? r12 ) Y ? (n ? 1)s2 = Y X n? 2 (n ? 2)s2 X 2 sY 1 ? r12 b1 sY = r12 ? v = s{b1 } sX n ? 2 sX v n ? 2 r12 1? 2 r12 = t? 2-10 2. 61. ? ? 2 ? (Yi1 ? Y1 )(Yi2 ? Y2 ) ? ?(Yi1 ? Y1 )2 ? ?(Yi1 ? Y1 )2 SSR(Y1 ) = ? SST O ? (Yi2 ? Y2 )2 ? ? [ (Yi1 ? Y1 )(Yi2 ? Y2 )]2 = ? ? (Yi1 ? Y1 )2 (Yi2 ? Y2 )2 ? ? 2 (Yi2 ? Y2 )(Yi1 ?

Y1 ) ? (Yi2 ? Y2 )2 ? 2 )2 (Yi2 ? Y SSR(Y2 ) = ? SST O (Yi1 ? Y1 )2 ? ? [ (Yi2 ? Y2 )(Yi1 ? Y1 )]2 = ? ? (Yi1 ? Y1 )2 (Yi2 ? Y2 )2 2. 62. Total population: R2 = 0. 884067 Number of hospital beds: R2 = 0. 903383 Total personal income: R2 = 0. 898914 2. 63. Region 1: 480. 0 ± 1. 66008(110. 1), 297. 2252 ? ?1 ? 662. 7748 Region 2: 299. 3 ± 1. 65936(154. 2), 43. 42669 ? ?1 ? 555. 1733 Region 3: 272. 22 ± 1. 65508(70. 34), 155. 8017 ? ?1 ? 388. 6383 Region 4: 508. 0 ± 1. 66543(359. 0), ? 89. 88937 ? ?1 ? 1105. 889 Infection rate: R2 = . 2846 Facilities: R2 = . 1264 X-ray: R2 = . 1463 2. 64. 2. 65. Region 1: 1. 3478 ± 2. 056(. 316), . 981 ? ?1 ? 1. 9975 Region 2: . 4832 ± 2. 042(. 137), . 2034 ? ?1 ? .7630 Region 3: . 5251 ± 2. 031(. 111), . 2997 ? ?1 ? .7505 Region 4: . 0173 ± 2. 145(. 306), ?. 6391 ? ?1 ? .6737 2. 66. a. E{Yh } = 36 when Xh = 4, E{Yh } = 52 when Xh = 8, E{Yh } = 68 when Xh = 12, E{Yh } = 84 when Xh = 16, E{Yh } = 100 when Xh = 20 25 = . 3953 160 Expected proportion is . 95 E{b1 } = 4, ? {b1 } = c. d. 2-11 Chapter 3 DIAGNOSTICS AND REMEDIAL MEASURES 3. 3. b. and c. i: ? Yi : ei : d. Ascending order: 1 2 3 … Ordered residual: -2. 74004 -1. 83169 -1. 24373 . . . Expected value: -1. 59670 -1. 37781 -1. 25706 . . . 119 120 0. 99441 1. 22737 1. 7781 1. 59670 1 2 3 … 2. 92942 2. 65763 3. 20121 . . . 0. 967581 1. 22737 0. 57679 . . . 118 119 3. 20121 2. 73528 0. 71279 -0. 87528 120 3. 20121 -0. 25321 e. H0 : Normal, Ha : not normal. r = 0. 97373. If r ? .987 concludeH0 , otherwise Ha . Conclude Ha . ? ? n1 = 65, d1 = 0. 43796, n2 = 55, d2 = 0. 50652, s = 0. 417275, t? = (0. 43796 ? BF 0. 50652)/0. 417275 (1/65) + (1/55) = ? 0. 89674, t(. 995; 18) = 2. 61814. If |t? | ? BF 2. 61814 conclude error variance constant, otherwise error variance not constant. Conclude error variance constant. i: ? i : Y ei : 1 2 … 29. 49034 59. 56084 . . . -9. 49034 0. 43916 . . . 44 45 59. 6084 74. 59608 1. 43916 2. 40392 44 45 14. 40392 15. 40392 16. 04643 19. 63272 3. 4. c and d. 
e. Ascending order: 1 2 … Ordered residual: -22. 77232 -19. 70183 . . . Expected value: -19. 63272 -16. 04643 . . . H0 : Normal, Ha : not normal. r = 0. 9891. If r ? .9785 conclude H0 , otherwise Ha . Conclude H0 . g. 2 SSR? = 15, 155, SSE = 3416. 38, XBP = (15, 155/2) ? (3416. 38/45)2 = 1. 314676, 2 ? 2 (. 95; 1) = 3. 84. If XBP ? 3. 84 conclude error variance constant, otherwise error variance not constant. Conclude error variance constant. 3. 5. c. 3-1 i: 1 ei : 1. 8 e. 2 3 4 -1. 2 -1. 2 1. 8 5 6 -. 2 -1. 2 7 -2. 2 8 . 8 9 . 8 0 . 8 7 8 . 8 . 8 . 6 1. 0 conclude 9 1. 8 1. 5 H0 , 10 1. 8 2. 3 otherwise Ha . Ascending Order: 1 2 Ordered residual: -2. 2 -1. 2 Expected value: -2. 3 -1. 5 H0 : Normal, Ha : not normal. r Conclude H0 . g. 3 4 5 6 -1. 2 -1. 2 -. 2 . 8 -1. 0 -. 6 -. 2 . 2 = . 961. If r ? .879 2 SSR? = 6. 4, SSE = 17. 6, XBP = (6. 4/2) ? (17. 6/10)2 = 1. 03, ? 2 (. 90; 1) = 2. 71. 2 If XBP ? 2. 71 conclude error variance constant, otherwise error variance not constant. Conclude error variance constant. Yes. 3. 6. a and b. i: ei : ? Yi : i: ei : ? Yi : i: ei : ? Yi : c. and d. Ascending order: 1 2 3 4 5 6 Ordered residual: -5. 150 -3. 975 -3. 00 -2. 425 -2. 150 -1. 975 Expected value -5. 720 -4. 145 -3. 196 -2. 464 -1. 841 -1. 280 e? : -1. 592 -1. 229 -1. 144 -. 750 -. 665 -. 611 i Ascending order: 7 8 9 10 11 12 Ordered residual: -1. 150 . 025 . 300 . 575 1. 300 2. 575 Expected value: -. 755 -. 250 . 250 . 755 1. 280 1. 841 e? : -. 356 . 008 . 093 . 178 . 402 . 796 i Ascending order: 13 14 15 16 Ordered residual: 3. 025 3. 300 3. 850 5. 575 Expected value: 2. 464 3. 196 4. 145 5. 720 e? : . 935 1. 020 1. 190 1. 724 i H0 : Normal, Ha : not normal. r = . 992. If r ? .941 conclude H0 , otherwise Ha . Conclude H0 . t(. 25; 14) = ?. 692, t(. 50; 14) = 0, t(. 75; 14) = . 92 Actual: e. 4/16 7/16 11/16 ? 1 = 2. 931, n2 = 8, d2 = 2. 194, s = 1. 724, ? n1 = 8, d t? = (2. 931 ? 2. 194)/1. 724 (1/8) + (1/8) = . 86, t(. 975; 14) = 2. 145. If |t? | ? BF BF 2. 
145 conclude error variance constant, otherwise error variance not constant. Conclude error variance constant. 3-2 1 2 3 4 -2. 150 3. 850 -5. 150 -1. 150 201. 150 201. 150 201. 150 201. 150 7 8 9 -2. 425 5. 575 3. 300 217. 425 217. 425 233. 700 10 . 300 233. 700 5 . 575 217. 425 11 1. 300 233. 700 6 2. 575 217. 425 12 -3. 700 233. 700 13 14 15 16 . 025 -1. 975 3. 025 -3. 975 249. 975 249. 975 249. 975 249. 975 3. 7. b and c. i: ei : ? Yi : d.

Ascending order: 1 2 … Ordered residual: -16. 13683 -13. 80686 . . . Expected value: -18. 90095 -15. 75218 . . . 59 60 13. 95312 23. 47309 15. 75218 18. 90095 1 0. 82324 105. 17676 2 … -1. 55675 . . . 107. 55675 . . . 59 60 -0. 66689 8. 09309 70. 66689 65. 90691 H0 : Normal, Ha : not normal. r = 0. 9897. If r ? 0. 984 conclude H0 , otherwise Ha . Conclude H0 . e. SSR? = 31, 833. 4, SSE = 3, 874. 45, 2 2 XBP = (31, 833. 4/2) ? (3, 874. 45/60)2 = 3. 817116, ? 2 (. 99; 1) = 6. 63. If XBP ? 6. 63 conclude error variance constant, otherwise error variance not constant. Conclude error variance constant. Yes. 3. 8. b and c. i: ei : ?

Yi : d. Ascending order: 1 2 … Ordered residual: -5278. 310 -3285. 062 . . . Expected value: -5740. 725 -4874. 426 . . . 83 84 4623. 566 6803. 265 4874. 426 5740. 725 1 2 … 591. 964 1648. 566 . . . 7895. 036 6530. 434 . . . 83 621. 141 6359. 859 84 28. 114 7553. 886 e. H0 : Normal, Ha : not normal. r = 0. 98876. If r ? 0. 9854 conclude H0 , otherwise Ha . Conclude H0 . ? ? n1 = 8, d1 = 1751. 872, n2 = 76, d2 = 1927. 083, s = 1327. 772, t? = (1751. 872 ? 1927. 083)/1327. 772 (1/8) + (1/76) = ? 0. 35502, t(. 975; 82) = BF 1. 98932. If |t? | ? 1. 98932 conclude error variance constant, otherwise error BF variance not constant.

Conclude error variance constant. 3. 10. b. 4, 4 3. 11. b. 2 SSR? = 330. 042, SSE = 59. 960, XBP = (330. 042/2) ? (59. 960/9)2 = 3. 72, 2 2 ? (. 95; 1) = 3. 84. If XBP ? 3. 84 conclude error variance constant, otherwise error variance not constant. Conclude error variance constant. 3. 13. a. b. H0 : E{Y } = ? 0 + ? 1 X, Ha : E{Y } = ? 0 + ? 1 X SSP E = 2797. 66, SSLF = 618. 719, F ? = (618. 719/8)? (2797. 66/35) = 0. 967557, F (. 95; 8, 35) = 2. 21668. If F ? ? 2. 21668 conclude H0 , otherwise Ha . Conclude H0 . H0 : E{Y } = ? 0 + ? 1 X, Ha : E{Y } = ? 0 + ? 1 X. SSP E = 128. 750, SSLF = 17. 675, F ? = (17. 675/2) ? (128. 50/12) = . 824, F (. 99; 2, 12) = 6. 93. If F ? ? 6. 93 conclude H0 , otherwise Ha . Conclude H0 . 3-3 3. 14. a. 3. 15. a. b. ? Y = 2. 57533 ? 0. 32400X H0 : E{Y } = ? 0 + ? 1 X, Ha : E{Y } = ? 0 + ? 1 X. SSP E = . 1575, SSLF = 2. 7675, F ? = (2. 7675/3) ? (. 1575/10) = 58. 5714, F (. 975; 3, 10) = 4. 83. If F ? ? 4. 83 conclude H0 , otherwise Ha . Conclude Ha . ?: -. 2 -. 1 0 SSE: . 1235 . 0651 . 0390 . 1 . 2 . 0440 . 0813 3. 16. b. c. e. ? Y = . 65488 ? .19540X i: 1 2 3 4 5 6 7 8 ei : -. 051 . 058 . 007 -. 083 -. 057 . 035 . 012 . 086 ? i : -1. 104 -1. 104 -1. 104 -. 713 -. 713 -. 713 -. 322 -. 322 Y Expected value: -. 47 . 062 . 000 -. 086 -. 062 . 035 . 008 . 086 i: 9 10 11 12 13 14 15 ei : . 046 . 018 -. 008 -. 039 -. 006 -. 050 . 032 ? Yi : -. 322 . 069 . 069 . 069 . 459 . 459 . 459 Expected value: . 047 . 017 -. 017 -. 026 -. 008 -. 035 . 026 f. 3. 17. b. ? Y =antilog10 (. 65488 ? .19540X) = 4. 51731(. 63768)X ? : . 3 . 4 SSE: 1099. 7 967. 9 . 5 916. 4 . 6 942. 4 . 7 1044. 2 c. e. ? Y = 10. 26093 + 1. 07629X i: 1 ei : -. 36 ? Yi : 10. 26 Expected value: -. 24 i: 6 ei : -. 41 ? Yi : 15. 64 Expected value: -. 36 2 . 28 11. 34 . 14 7 . 10 16. 72 . 04 3 . 31 12. 41 . 36 8 -. 47 17. 79 -. 56 4 5 -. 15 . 30 13. 49 14. 57 -. 14 . 24 9 10 . 7 -. 07 18. 87 19. 95 . 56 -. 04 f. 3. 18. b. d. ? Y = (10. 26093 + 1. 07629X)2 ? Y = 1. 25470 ? 3. 
62352X i: 1 ei : -1. 00853 ? Yi : 15. 28853 Expected value: -0. 97979 v ? Y = 1. 25470 ? 3. 62352 X 2 3 … -3. 32526 1. 64837 . . . 12. 12526 10. 84163 . . . -3. 10159 1. 58857 . . . 110 -0. 67526 12. 12526 -0. 59149 111 0. 49147 15. 28853 0. 36067 e. 3-4 3. 21. = ? (Yij ? Yij )2 = ? (Yij ? Yj )2 + ? ? ? 2 (Yij ? Yj ) + (Yj ? Yij ) ? ? ? ? ? (Yj ? Yij )2 + 2 (Yij ? Yj )(Yj ? Yij ) ? ? ? Now, (Yij ? Yj )(Yj ? Yij ) 2 ? ? ? ? ? = Yij Yj ? Yj ? Yij Yij + Yj Yij ? ? ? ? ? ? = nj Yj2 ? nj Yj2 ? Yij nj Yj + nj Yj Yij = 0 j j j j since Yij = b0 + b1 Xj is independent of i. 3. 22. E{M SP E} = E = 1 n? c ? (Yij ? Yj )2 1 = E{(nj ? 1)s2 } j n? c n? c ? 2 E{? 2 ? 2 (nj ? 1)} = (nj ? 1) = ? 2 n? c 3. 23. Full: Yij = µj + ? ij , reduced: Yij = ? 1 Xj + ? ij dfF = 20 ? 10 = 10, dfR = 20 ? 1 = 19 3. 24. a. ? Y = 48. 66667 + 2. 33333X i: 1 2 3 ei : 2. 6667 -. 3333 -. 3333 4 5 6 7 -1. 0000 -4. 0000 -7. 6667 13. 3333 8 -2. 6667 b. c. 3. 27. b. ? Y = 53. 06796 + 1. 62136X ? Yh = 72. 52428, s{pred} = 3. 0286, t(. 995; 5) = 4. 032, 72. 52428 ± 4. 032(3. 0286), 60. 31296 ? Yh(new) ? 84. 73560, yes ? Y = 6. 84922 + . 60975X ? Xh = 6. 5: Yh = 10. 81260, s{pred} = 1. 583, t(. 975; 109) = 1. 982, 10. 81260 ± 1. 982(1. 2583), 8. 31865 ? Yh(new) ? 13. 30655 ? Xh = 5. 9: Yh = 10. 44675, s{pred} = 1. 2512, 10. 44675 ± 1. 982(1. 2512), 7. 96687 ? Yh(new) ? 12. 92663 Yes 3. 29. a. Band 1 2 3 4 b. Median X Y 2 23. 5 4 57 5 81. 5 7 111 F (. 90; 2, 43) = 2. 43041, W = 2. 204727 Xh = 2: 29. 4903 ± 2. 204727(2. 00609), 25. 067 ? E{Yh } ? 33. 913 Xh = 4: 59. 5608 ± 2. 204727(1. 43307), 56. 401 ? E{Yh } ? 62. 720 Xh = 5: 74. 5961 ± 2. 204727(1. 32983), 71. 664 ? E{Yh } ? 77. 528 3-5 Xh = 7: 104. 667 ± 2. 204727(1. 6119), 101. 113 ? E{Yh } ? 108. 221 No c. Neighborhood 1 2 3 4 5 6 3. 30. a. Band 1 2 3 4 5 b.

Neighborhood 1 2 3 4 5 6 7 c. Xc 1 2 3 4 5 6 7 ? Yc 131. 67 158. 33 187. 00 210. 33 245. 33 271. 67 319. 00 Median X Y 0. 5 116. 5 2. 5 170. 0 4. 5 226. 5 6. 5 291. 5 8. 5 384. 5 Xc 2 3 4 5 6 7 ? Yc 27. 000 43. 969 60. 298 77. 905 93. 285 107. 411 F (. 95; 2, 8) = 4. 46, W = 2. 987 Xh = 1: 124. 061 ± 2. 987(7. 4756), 101. 731 ? E{Yh } ? 146. 391 Xh = 2: 156. 558 ± 2. 987(6. 2872), 137. 778 ? E{Yh } ? 175. 338 Xh = 3: 189. 055 ± 2. 987(5. 3501), 173. 074 ? E{Yh } ? 205. 036 Xh = 4: 221. 552 ± 2. 987(4. 8137), 207. 174 ? E{Yh } ? 235. 931 Xh = 5: 254. 049 ± 2. 987(4. 8137), 239. 671 ? E{Yh } ? 268. 428 Xh = 6: 286. 546 ± 2. 87(5. 3501), 270. 565 ? E{Yh } ? 302. 527 Xh = 7: 319. 043 ± 2. 987(6. 2872), 300. 263 ? E{Yh } ? 337. 823 Yes 3-6 Chapter 4 SIMULTANEOUS INFERENCES AND OTHER TOPICS IN REGRESSION ANALYSIS 4. 1. 4. 2. 4. 3. No, no 90 percent a. b. Opposite directions, negative tilt B = t(. 9875; 43) = 2. 32262, b0 = ? 0. 580157, s{b0 } = 2. 80394, b1 = 15. 0352, s{b1 } = 0. 483087 ? 0. 580157 ± 2. 32262(2. 80394) 15. 0352 ± 2. 32262(0. 483087) c. 4. 4. a. b. Yes Opposite directions, negative tilt B = t(. 9975; 8) = 3. 833, b0 = 10. 2000, s{b0 } = . 6633, b1 = 4. 0000, s{b1 } = . 4690 10. 2000 ± 3. 833(. 6633) 4. 0000 ± 3. 833(. 4690) 4. 5. . 7. 658 ? ?0 ? 12. 742 2. 202 ? ?1 ? 5. 798 ? 7. 093 ? ?0 ? 5. 932 13. 913 ? ?1 ? 16. 157 B = t(. 975; 14) = 2. 145, b0 = 168. 6000, s{b0 } = 2. 6570, b1 = 2. 0344, s{b1 } = . 0904 168. 6000 ± 2. 145(2. 6570) 2. 0344 ± 2. 145(. 0904) 162. 901 ? ?0 ? 174. 299 1. 840 ? ?1 ? 2. 228 b. 4. 6. a. Negatively, no B = t(. 9975; 14) = 2. 91839, b0 = 156. 347, s{b0 } = 5. 51226, b1 = ? 1. 190,s{b1 } = 0. 0901973 156. 347 ± 2. 91839(5. 51226) ? 1. 190 ± 2. 91839(0. 0901973) 140. 260 ? ?0 ? 172. 434 ? 1. 453 ? ?1 ? ?0. 927 b. Opposite directions 4-1 c. 4. 7. a. No F (. 90; 2, 43) = 2. 43041, W = 2. 204727 Xh = 3: 44. 5256 ± 2. 204727(1. 7501) 40. 833 ? E{Yh } ? 48. 219 Xh = 5: 74. 5961 ± 2. 204727(1. 32983) 71. 664 ? E{Yh } ? 77. 
528 Xh = 7: 104. 667 ± 2. 204727(1. 6119) 101. 113 ? E{Yh } ? 108. 221 b. c. F (. 90; 2, 43) = 2. 43041, S = 2. 204727; B = t(. 975; 43) = 2. 01669; Bonferroni Xh = 4: 59. 5608 ± 2. 01669(9. 02797) 41. 354 ? Yh(new) ? 77. 767 Xh = 7: 104. 667 ± 2. 01669(9. 05808) 86. 3997 ? Yh(new) ? 122. 934 F (. 95; 2, 8) = 4. 46, W = 2. 987 Xh = 0: 10. 2000 ± 2. 987(. 6633) 8. 219 ? E{Yh } ? 12. 181 Xh = 1: 14. 2000 ± 2. 987(. 4690) 12. 799 ? E{Yh } ? 15. 601 Xh = 2: 18. 2000 ± 2. 987(. 6633) 16. 219 ? E{Yh } ? 20. 181 4. 8. a. b. c. B = t(. 9167; 8) = 3. 016, yes F (. 95; 3, 8) = 4. 07, S = 3. 494 Xh = 0: 10. 2000 ± 3. 494(1. 6248) 4. 523 ? Yh(new) ? 15. 877 Xh = 1: 14. 2000 ± 3. 494(1. 5556) 8. 765 ? Yh(new) ? 19. 635 Xh = 2: 18. 2000 ± 3. 494(1. 6248) 12. 523 ? Yh(new) ? 23. 877 d. 4. 9. a. B = 3. 016, yes B = t(. 9833; 14) = 2. 360 Xh = 20: 209. 2875 ± 2. 360(1. 0847) 206. 727 ? E{Yh } ? 211. 847 Xh = 30: 229. 6312 ± 2. 360(0. 8285) 227. 676 ? E{Yh } ? 231. 586 Xh = 40: 249. 9750 ± 2. 360(1. 3529) 246. 782 ? E{Yh } ? 253. 168 b. c. F (. 90; 2, 14) = 2. 737, W = 2. 340, no F (. 90; 2, 14) = 2. 737, S = 2. 340, B = t(. 975; 14) = 2. 145 Xh = 30: 229. 6312 ± 2. 45(3. 3385) 222. 470 ? Yh(new) ? 236. 792 Xh = 40: 249. 9750 ± 2. 145(3. 5056) 242. 455 ? Yh(new) ? 257. 495 4. 10. a. F (. 95; 2, 58) = 3. 15593, W = 2. 512342 Xh = 45: 102. 797 ± 2. 512342(1. 71458) 98. 489 ? E{Yh } ? 107. 105 Xh = 55: 90. 8968 ± 2. 512342(1. 1469) 88. 015 ? E{Yh } ? 93. 778 Xh = 65: 78. 9969 ± 2. 512342(1. 14808) 76. 113 ? E{Yh } ? 81. 881 b. c. B = t(. 99167; 58) = 2. 46556, no B = 2. 46556 Xh = 48: 99. 2268 ± 2. 46556(8. 31158) 78. 734 ? Yh(new) ? 119. 720 Xh = 59: 86. 1368 ± 2. 46556(8. 24148) 65. 817 ? Yh(new) ? 106. 457 4-2 Xh = 74: 68. 2869 ± 2. 46556(8. 33742) 47. 730 ? Yh(new) ? 88. 843 d. 4. 12. a. c.
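The Bonferroni limits used throughout this chapter have the form bk ± B s{bk}, with B = t(1 − α/2g; n − 2) for g simultaneous statements. As a quick numerical check, the Problem 4.5 figures can be reproduced with a short Python sketch (the variable names are ours; the multiplier B is read from the t table rather than computed):

```python
# Joint 90 percent Bonferroni confidence limits for beta0 and beta1,
# using the Problem 4.5 values: B = t(.975; 14) = 2.145 for g = 2.
B = 2.145
b0, s_b0 = 168.6000, 2.6570
b1, s_b1 = 2.0344, 0.0904

ci_beta0 = (b0 - B * s_b0, b0 + B * s_b0)   # 162.901 <= beta0 <= 174.299
ci_beta1 = (b1 - B * s_b1, b1 + B * s_b1)   # 1.840 <= beta1 <= 2.228
```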

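The regression-through-the-origin test of Problem 4.12 below (H0: β1 = 17.50) can also be verified numerically. A minimal sketch, using the b1 and s{b1} values reported in the solution (helper names are ours):

```python
# t statistic for H0: beta1 = 17.50 in regression through the origin,
# with b1 = 18.0283 and s{b1} = .07948 from the Problem 4.12 solution.
b1, s_b1 = 18.0283, 0.07948
t_star = (b1 - 17.50) / s_b1        # about 6.65
t_crit = 2.718                      # t(.99; 11) from the t table
conclude_Ha = abs(t_star) > t_crit
```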
Yes, yes
Ŷ = 18.0283X
H0: β1 = 17.50, Ha: β1 ≠ 17.50. MSE = 20.3113, s{b1} = .07948, t* = (18.0283 − 17.50)/.07948 = 6.65, t(.99; 11) = 2.718. If |t*| ≤ 2.718 conclude H0, otherwise Ha. Conclude Ha.
Ŷh = 180.283, s{pred} = 4.576, 180.283 ± 2.718(4.576), 167.845 ≤ Yh(new) ≤ 192.721
i:   1      2       3       4       5       6
ei:  1.802  −3.340  10.717  −2.283  −2.396  −4.708
i:   7      8       9       10      11      12
ei:  −.849  6.292   −.510   −3.283  2.887   −1.170
No
b. H0: E{Y} = β1X, Ha: E{Y} ≠ β1X. SSLF = 40.924, SSPE = 182.500, F* = (40.924/8) ÷ (182.500/3) = .084, F(.99; 8, 3) = 27.5. If F* ≤ 27.5 conclude H0, otherwise Ha. Conclude H0. P-value = .997
Ŷ = 0.121643X
s{b1} = 0.00263691, t(.975; 19) = 1.9801, 0.121643 ± 1.9801(0.00263691), 0.116 ≤ β1 ≤ 0.127
Ŷh = 3.64929, s{Ŷh} = 0.0791074, 3.64929 ± 1.9801(0.0791074), 3.493 ≤ E{Yh} ≤ 3.806
i:   1       2       ...  44      45      ...  119      120
ei:  1.3425  2.1820  ...  1.2111  2.2639  ...  −0.0863  −0.4580
No
c. H0: E{Y} = β1X, Ha: E{Y} ≠ β1X. SSLF = 23.3378, SSPE = 39.3319, F* = (23.3378/20) ÷ (39.3319/99) = 2.93711, F(.995; 20, 99) = 2.22939. If F* ≤ 2.22939 conclude H0, otherwise Ha. Conclude Ha. P-value = 0.0002
Ŷ = 14.9472X
s{b1} = 0.226424, t(.95; 44) = 1.68023, 14.9472 ± 1.68023(0.226424), 14.567 ≤ β1 ≤ 15.328
Ŷh = 89.6834, s{pred} = 8.92008, 89.6834 ± 1.68023(8.92008), 74.696 ≤ Yh(new) ≤ 104.671
i:   1         2        ...
ei:  −9.89445  0.21108  ...
No
d.
4.13. a.
4.14. a. b. c.
4.15. b.
4.16. a. b. c.
4.17. b. c. H0: E{Y} = β1X, Ha: E{Y} ≠ β1X. SSLF = 622.12, SSPE = 2,797.66, F* = (622.12/9) ÷ (2,797.66/35) = 0.8647783, F(.99; 9, 35) = 2.96301. If F* ≤ 2.96301 conclude H0, otherwise Ha. Conclude H0. P-value = 0.564
4.18. No
4.19. a. X̂h(new) = 33.11991, t(.95; 118) = 1.657870, s{predX} = 16.35037, 33.11991 ± 1.657870(16.35037), 6.013 ≤ Xh(new) ≤ 60.227
b. No, 0.297453 > .1
4.20. a. X̂h(new) = 34.1137, t(.995; 14) = 2.977, s{predX} = 1.6610, 34.1137 ± 2.977(1.6610), 29.169 ≤ Xh(new) ≤ 39.058
b. Yes, .0175 < .1
4.21. Yes, no
4.22. Let Ā3 denote the event that statement 3 is correct and B̄ the event Ā1 ∩ Ā2. Then by (4.2a):
P(B̄ ∩ Ā3) = P(Ā1 ∩ Ā2 ∩ Ā3) ≥ 1 − 2α − α = 1 − 3α
4.23. From (4.13) it follows at once that ΣXi(Yi − b1Xi) = ΣXiei = 0
4.24. From Exercise 1.41c, we have that E{b1} = β1. Hence: E{Ŷ} = E{b1X} = XE{b1} = β1X = E{Y}.
4.25. σ²{Ŷh} = σ²{b1Xh} = Xh²σ²{b1} = Xh²(σ²/ΣXi²); hence s²{Ŷh} = Xh²(MSE/ΣXi²).
4.26. a. B = t(.9875; 438) = 2.24913, b0 = −110.635, s{b0} = 34.7460, b1 = 0.00279542, s{b1} = 0.0000483694
−110.635 ± 2.24913(34.7460)  −188.783 ≤ β0 ≤ −32.487
0.00279542 ± 2.24913(0.00004837)  0.00269 ≤ β1 ≤ 0.00290
b. Yes
c. F(.90; 2, 438) = 2.31473, W = 2.151618; B = t(.9833; 438) = 2.13397; Bonferroni
Xh = 500: −109.237 ± 2.13397(34.7328)  −183.356 ≤ E{Yh} ≤ −35.118
Xh = 1,000: −107.839 ± 2.13397(34.7196)  −181.930 ≤ E{Yh} ≤ −33.748
Xh = 5,000: −96.6577 ± 2.13397(34.6143)  −170.524 ≤ E{Yh} ≤ −22.792
d.
4.27. a. B = t(.975; 111) = 1.982, b0 = 6.3368, s{b0} = .5213, b1 = .7604, s{b1} = .1144
6.3368 ± 1.982(.5213)  5.304 ≤ β0 ≤ 7.370
0.7604 ± 1.982(.1144)  0.534 ≤ β1 ≤ 0.987
b. No
c. F(.95; 2, 111) = 3.08, W = 2.482; B = t(.99375; 111) = 2.539; Working-Hotelling
Xh = 2: 7.858 ± 2.482(.3098)  7.089 ≤ E{Yh} ≤ 8.627
Xh = 3: 8.618 ± 2.482(.2177)  8.078 ≤ E{Yh} ≤ 9.158
Xh = 4: 9.378 ± 2.482(.1581)  8.986 ≤ E{Yh} ≤ 9.770
Xh = 5: 10.139 ± 2.482(.1697)  9.718 ≤ E{Yh} ≤ 10.560
d.

Chapter 5
MATRIX APPROACH TO SIMPLE LINEAR REGRESSION ANALYSIS

5.1. (1) [2 7; 3 10; 5 13]  (2) [0 1; 1 2; 1 3]  (3) [23 24; 36 40; 49 56]  (4) [13 17 22; 20 26 34; 27 35 46]  (5) [9 26; 26 76]
5.2. (1) [5 9; 10 8; 6 12; 11 11], [14 49 71 76]  (2) [Y1 Y2 Y3 Y4]′ = [22 54 82 80]′, [−1 7; −5 1; 0 6; −2 4]  (3) [58 80]  (4) [11 8; 20 26; 32 38; 28 40]  (5) [63 94; 55 73]
5.3. (1) [Y1 − Ŷ1; Y2 − Ŷ2; Y3 − Ŷ3; Y4 − Ŷ4] = [e1; e2; e3; e4]
(2) [X1 X2 X3 X4][e1; e2; e3; e4] = 0
5.4. (1) 503.77  (2) [5 0; 0 160]  (3) [49.7; −39.2]
5.5. (1) 1,259  (2) [6 17; 17 55]  (3) [81; 261]
5.6. (1) 2,194  (2) [10 10; 10 20]  (3) [142; 182]
5.7. (1) 819,499  (2) [16 448; 448 13,824]  (3) [3,609; 103,656]
5.8. a. Yes  b. 2  c. 0
5.9. a. Yes  b. Yes  c. 2  d. 0
5.10. A⁻¹ = [−.1 .4; .3 −.2]
B⁻¹ = [.10870 −.08696 .10870; −.02174 −.15217 .34783; −.23913 .14130 .01087]
5.11. [.33088 −.15441 −.03676; .09559 .13971 −.19853; −.26471 .32353 .02941]
5.12. [.2 0; 0 .00625]
5.13. [1.34146 −.41463; −.41463 .14634]
5.14. a. [4 7; 2 3][y1; y2] = [25; 12]  b. [y1; y2] = [4.5; 1]
5.15. a. [5 2; 23 7][y1; y2] = [8; 28]  b. [y1; y2] = [0; 4]
5.16. Ŷ = [Ŷ1; Ŷ2; Ŷ3; Ŷ4; Ŷ5] = Ȳ[1; 1; 1; 1; 1] + b1[X1 − X̄; X2 − X̄; X3 − X̄; X4 − X̄; X5 − X̄]
5.17. a. [W1; W2; W3] = [1 1 1; 1 −1 0; 1 −1 −1][Y1; Y2; Y3]
b. E{[W1; W2; W3]} = [1 1 1; 1 −1 0; 1 −1 −1][E{Y1}; E{Y2}; E{Y3}] = [E{Y1} + E{Y2} + E{Y3}; E{Y1} − E{Y2}; E{Y1} − E{Y2} − E{Y3}]
c. σ²{W} = [1 1 1; 1 −1 0; 1 −1 −1][σ²{Y1} σ{Y1,Y2} σ{Y1,Y3}; σ{Y2,Y1} σ²{Y2} σ{Y2,Y3}; σ{Y3,Y1} σ{Y3,Y2} σ²{Y3}][1 1 1; 1 −1 −1; 1 0 −1]
Using the notation σ1² for σ²{Y1}, σ12 for σ{Y1,Y2}, etc., we obtain:
σ²{W1} = σ1² + σ2² + σ3² + 2σ12 + 2σ13 + 2σ23
σ²{W2} = σ1² + σ2² − 2σ12
σ²{W3} = σ1² + σ2² + σ3² − 2σ12 − 2σ13 + 2σ23
σ{W1,W2} = σ1² − σ2² + σ13 − σ23
σ{W1,W3} = σ1² − σ2² − σ3² − 2σ23
σ{W2,W3} = σ1² + σ2² − 2σ12 − σ13 + σ23
5.18. a. [W1; W2] = [1/4 1/4 1/4 1/4; 1/2 1/2 −1/2 −1/2][Y1; Y2; Y3; Y4]
b. E{[W1; W2]} = [(1/4)(E{Y1} + E{Y2} + E{Y3} + E{Y4}); (1/2)(E{Y1} + E{Y2} − E{Y3} − E{Y4})]
c. σ²{W} = Aσ²{Y}A′, with A the matrix in part (a). Using the notation σ1² for σ²{Y1}, σ12 for σ{Y1,Y2}, etc., we obtain:
σ²{W1} = (1/16)(σ1² + σ2² + σ3² + σ4² + 2σ12 + 2σ13 + 2σ14 + 2σ23 + 2σ24 + 2σ34)
σ²{W2} = (1/4)(σ1² + σ2² + σ3² + σ4² + 2σ12 − 2σ13 − 2σ14 − 2σ23 − 2σ24 + 2σ34)
σ{W1,W2} = (1/8)(σ1² + σ2² − σ3² − σ4² + 2σ12 − 2σ34)
5.19. [3 5; 5 17]
5.20. [7 −4; −4 8]
5.21. 5Y1² + 4Y1Y2 + Y2²
5.22. Y1² + 3Y2² + 9Y3² + 8Y1Y3
5.23. a. (1) b = [9.940; −.245]  (2) e′ = (−.18, .04, .26, .08, −.20)  (3) 9.604  (4) .148  (5) [.00987 0; 0 .000308]  (6) 11.41  (7) .02097
c. H = [.6 .4 .2 0 −.2; .4 .3 .2 .1 0; .2 .2 .2 .2 .2; 0 .1 .2 .3 .4; −.2 0 .2 .4 .6]
d. s²{e} = [.01973 −.01973 −.00987 .00000 .00987; −.01973 .03453 −.00987 −.00493 .00000; −.00987 −.00987 .03947 −.00987 −.00987; .00000 −.00493 −.00987 .03453 −.01973; .00987 .00000 −.00987 −.01973 .01973]
5.24. a. (1) b = [.43902; 4.60976]  (2) e′ = (−2.8781, −.0488, .3415, .7317, −1.2683, 3.1219)  (3) 145.2073  (4) 20.2927  (5) [6.8055 −2.1035; −2.1035 .7424]  (6) 18.878  (7) 6.9290
b. (1) −2.1035  (2) 6.8055  (3) .8616
c. H = [.366 −.146 .024 .195 .195 .366; −.146 .658 .390 .122 .122 −.146; .024 .390 .268 .146 .146 .024; .195 .122 .146 .171 .171 .195; .195 .122 .146 .171 .171 .195; .366 −.146 .024 .195 .195 .366]
s²{e} = [3.217 .742 −.124 −.990 −.990 −1.856; .742 1.732 −1.980 −.619 −.619 .742; −.124 −1.980 3.712 −.742 −.742 −.124; −.990 −.619 −.742 4.207 −.866 −.990; −.990 −.619 −.742 −.866 4.207 −.990; −1.856 .742 −.124 −.990 −.990 3.217]
5.25. a. (1) [.2 −.1; −.1 .1]  (2) b = [10.2; 4.0]  (3) e′ = (1.8, −1.2, −1.2, 1.8, −.2, −1.2, −2.2, .8, .8, .8)  (5) 17.60  (6) [.44 −.22; −.22 .22]  (7) 18.2  (8) .44
(4) H =
[.1 .1 .1 .1 .1 .1 .1 .1 .1 .1
 .1 .2 0 .2 −.1 .1 .2 .1 0 .2
 .1 0 .2 0 .3 .1 0 .1 .2 0
 .1 .2 0 .2 −.1 .1 .2 .1 0 .2
 .1 −.1 .3 −.1 .5 .1 −.1 .1 .3 −.1
 .1 .1 .1 .1 .1 .1 .1 .1 .1 .1
 .1 .2 0 .2 −.1 .1 .2 .1 0 .2
 .1 .1 .1 .1 .1 .1 .1 .1 .1 .1
 .1 0 .2 0 .3 .1 0 .1 .2 0
 .1 .2 0 .2 −.1 .1 .2 .1 0 .2]
b. (1) .22  (2) −.22  (3) .663
[0 0; −.1 .1; .1 −.1; −.1 .1; .2 −.2; 0 0; −.1 .1; 0 0; .1 −.1; −.1 .1]
c.
[0 0 0 0 0 0 0 0 0 0
 0 .1 −.1 .1 −.2 0 .1 0 −.1 .1
 0 −.1 .1 −.1 .2 0 −.1 0 .1 −.1
 0 .1 −.1 .1 −.2 0 .1 0 −.1 .1
 0 −.2 .2 −.2 .4 0 −.2 0 .2 −.2
 0 0 0 0 0 0 0 0 0 0
 0 .1 −.1 .1 −.2 0 .1 0 −.1 .1
 0 0 0 0 0 0 0 0 0 0
 0 −.1 .1 −.1 .2 0 −.1 0 .1 −.1
 0 .1 −.1 .1 −.2 0 .1 0 −.1 .1]
5.26. a. (1) [.675000 −.021875; −.021875 .00078125]  (2) b = [168.600000; 2.034375]
(3) Ŷ′ = (201.150, 201.150, 201.150, 201.150, 217.425, 217.425, 217.425, 217.425, 233.700, 233.700, 233.700, 233.700, 249.975, 249.975, 249.975, 249.975)
(4) H =
[.175 .175 .175 ··· −.050 −.050 −.050
 .175 .175 .175 ··· −.050 −.050 −.050
 .175 .175 .175 ··· −.050 −.050 −.050
 ⋮
 −.050 −.050 −.050 ··· .175 .175 .175
 −.050 −.050 −.050 ··· .175 .175 .175
 −.050 −.050 −.050 ··· .175 .175 .175]
(5) 146.425  (6) [7.0598 −.2288; −.2288 .008171]  (7) 11.1453
b. (1) 7.0598  (2) −.2288  (3) .0904
c.
[.825 −.175 −.175 ··· .050 .050 .050
 −.175 .825 −.175 ··· .050 .050 .050
 −.175 −.175 .825 ··· .050 .050 .050
 ⋮
 .050 .050 .050 ··· .825 −.175 −.175
 .050 .050 .050 ··· −.175 .825 −.175
 .050 .050 .050 ··· −.175 −.175 .825]
5.27. E{[ε1; ε2; ε3; ε4]} = [0; 0; 0; 0] = 0
5.28. Let X = [X1; X2; …; Xn]. Then by (5.60), b = (X′X)⁻¹X′Y = ΣXiYi / ΣXi².
5.29. E{b} = E{(X′X)⁻¹X′Y} = (X′X)⁻¹X′E{Y} = (X′X)⁻¹X′Xβ = β
5.30. Ŷh = X′h b is a scalar, hence it equals its transpose. By (5.32) then, X′h b = (X′h b)′ = b′Xh.
5.31. σ²{Ŷ} = Hσ²{Y}H′ = Hσ²IH′ = σ²HH = σ²H, since H is symmetric (H′ = H) and idempotent (HH = H) [by (5.46)].

Chapter 6
MULTIPLE REGRESSION – I

6.1. a. X = [1 X11 X12; 1 X21 X22; 1 X31 X32; 1 X41 X42], β = [β0; β1; β2]
b. X = [1 X11 X12; 1 X21 X22; 1 X31 X32; 1 X41 X42], β = [β0; β1; β2]
6.2. a. X = [X11 X12 X11²; X21 X22 X21²; X31 X32 X31²; X41 X42 X41²; X51 X52 X51²], β = [β1; β2; β3]
b. X = [1 X11 log10 X12; 1 X21 log10 X22; 1 X31 log10 X32; 1 X41 log10 X42; 1 X51 log10 X52], β = [β0; β1; β2]
6.5. a.
     Y      X1     X2
Y    1.000  .892   .395
X1          1.000  .000
X2                 1.000
b. b0 = 37.650, b1 = 4.425, b2 = 4.375, Ŷ = 37.650 + 4.425X1 + 4.375X2
c&d.
i:              1      2      3       4      5      6       7       8
ei:             −.10   .15    −3.10   3.15   −.95   −1.70   −1.95   1.30
Expected Val.:  −.208  .208   −3.452  2.661  −.629  −1.533  −2.052  1.533
i:              9      10      11     12     13      14      15     16
ei:             1.20   −1.55   4.20   2.45   −2.65   −4.40   3.35   .60
Expected Val.:  1.066  −1.066  4.764  2.052  −2.661  −4.764  3.452  .629
e. SSR* = 72.41, SSE = 94.30, X²_BP = (72.41/2) ÷ (94.30/16)² = 1.04, χ²(.99; 2) = 9.21. If X²_BP ≤ 9.21 conclude error variance constant, otherwise error variance not constant. Conclude error variance constant.
f. H0: E{Y} = β0 + β1X1 + β2X2, Ha: E{Y} ≠ β0 + β1X1 + β2X2. MSLF = 7.46, MSPE = 7.125, F* = 7.46/7.125 = 1.047, F(.99; 5, 8) = 6.63. If F* ≤ 6.63 conclude H0, otherwise Ha. Conclude H0.
6.6. a. H0: β1 = β2 = 0, Ha: not all βk = 0 (k = 1, 2). MSR = 936.350, MSE = 7.254, F* = 936.350/7.254 = 129.083, F(.99; 2, 13) = 6.70. If F* ≤ 6.70 conclude H0, otherwise Ha.
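The matrix result b = (X′X)⁻¹X′Y used throughout Chapter 5 (text equation 5.60) can be exercised with a small sketch. For simple linear regression the 2×2 inverse can be written out explicitly; the data below are illustrative, not from any problem in the text:

```python
# Least squares via the normal equations, b = (X'X)^{-1} X'Y, for a
# simple linear regression with intercept; the 2x2 inverse is expanded
# by hand rather than computed with a matrix library.
def lstsq_simple(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    det = n * sxx - sx * sx          # determinant of X'X
    b0 = (sxx * sy - sx * sxy) / det
    b1 = (n * sxy - sx * sy) / det
    return b0, b1

# Illustrative data lying exactly on the line Y = 1 + 2X:
b0, b1 = lstsq_simple([1, 2, 3, 4], [3, 5, 7, 9])
```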

Conclude Ha. P-value = 0+
b. s{b1} = .301, s{b2} = .673, B = t(.9975; 13) = 3.372
4.425 ± 3.372(.301)  3.410 ≤ β1 ≤ 5.440
4.375 ± 3.372(.673)  2.106 ≤ β2 ≤ 6.644
c.
6.7. a. SSR = 1,872.7, SSTO = 1,967.0, R² = .952
b. .952, yes
6.8. a. Ŷh = 77.275, s{Ŷh} = 1.127, t(.995; 13) = 3.012, 77.275 ± 3.012(1.127), 73.880 ≤ E{Yh} ≤ 80.670
b. s{pred} = 2.919, 77.275 ± 3.012(2.919), 68.483 ≤ Yh(new) ≤ 86.067
6.9. c.
     Y       X1      X2      X3
Y    1.0000  .2077   .0600   .8106
X1           1.0000  .0849   .0457
X2                   1.0000  .1134
X3                           1.0000
6.10. a. b&c. Ŷ = 4149.89 + 0.000787X1 − 13.166X2 + 623.54X3
i:              1         2         ...  51         52
ei:             −32.0635  169.2051  ...  −184.8776  64.5168
Expected Val.:  −24.1737  151.0325  ...  −212.1315  75.5358
e. n1 = 26, d̄1 = 145.0, n2 = 26, d̄2 = 77.4, s = 81.7, t*_BF = (145.0 − 77.4) ÷ [81.7 √(1/26 + 1/26)] = 2.99, t(.995; 50) = 2.67779. If |t*_BF| ≤ 2.67779 conclude error variance constant, otherwise error variance not constant. Conclude error variance not constant.
6.11. a. H0: β1 = β2 = β3 = 0, Ha: not all βk = 0 (k = 1, 2, 3). MSR = 725,535, MSE = 20,531.9, F* = 725,535/20,531.9 = 35.337, F(.95; 3, 48) = 2.79806. If F* ≤ 2.79806 conclude H0, otherwise Ha.
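The Brown-Forsythe statistic of Problem 6.10e can be checked with a short sketch. The group mean absolute deviations d̄1, d̄2 and the pooled s are taken directly from the solution; in practice s would be computed from the deviations themselves (function name is ours):

```python
import math

# Brown-Forsythe test statistic: t*_BF = (d1 - d2) / (s * sqrt(1/n1 + 1/n2)),
# using the Problem 6.10e values d1 = 145.0, d2 = 77.4, s = 81.7.
def brown_forsythe(d1, n1, d2, n2, s):
    return (d1 - d2) / (s * math.sqrt(1 / n1 + 1 / n2))

t_bf = brown_forsythe(145.0, 26, 77.4, 26, 81.7)   # about 2.98
nonconstant = abs(t_bf) > 2.67779                  # t(.995; 50)
```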

Conclude Ha. P-value = 0+.
b. s{b1} = .000365, s{b3} = 62.6409, B = t(.9875; 48) = 2.3139
0.000787 ± 2.3139(.000365)  −.000058 ≤ β1 ≤ .00163
623.554 ± 2.3139(62.6409)  478.6092 ≤ β3 ≤ 768.4988
c. SSR = 2,176,606, SSTO = 3,162,136, R² = .6883
6.12. a. F(.95; 4, 48) = 2.56524, W = 3.2033; B = t(.995; 48) = 2.6822; Bonferroni
Xh1 = 302,000, Xh2 = 7.2, Xh3 = 0:  4292.79 ± 2.6822(21.3567)  4235.507 ≤ E{Yh} ≤ 4350.073
Xh1 = 245,000, Xh2 = 7.4, Xh3 = 0:  4245.29 ± 2.6822(29.7021)  4165.623 ≤ E{Yh} ≤ 4324.957
Xh1 = 280,000, Xh2 = 6.9, Xh3 = 0:  4279.42 ± 2.6822(24.4444)  4213.855 ≤ E{Yh} ≤ 4344.985
Xh1 = 350,000, Xh2 = 7.0, Xh3 = 0:  4333.20 ± 2.6822(28.9293)  4255.606 ≤ E{Yh} ≤ 4410.794
Xh1 = 295,000, Xh2 = 6.7, Xh3 = 1:  4917.42 ± 2.6822(62.4998)  4749.783 ≤ E{Yh} ≤ 5085.057
b. Yes, no
6.13. F(.95; 4, 48) = 2.5652, S = 3.2033; B = t(.99375; 48) = 2.5953; Bonferroni
Xh1 = 230,000, Xh2 = 7.5, Xh3 = 0:  4232.17 ± 2.5953(147.288)  3849.913 ≤ Yh(new) ≤ 4614.427
Xh1 = 250,000, Xh2 = 7.3, Xh3 = 0:  4250.55 ± 2.5953(146.058)  3871.486 ≤ Yh(new) ≤ 4629.614
Xh1 = 280,000, Xh2 = 7.1, Xh3 = 0:  4276.79 ± 2.5953(145.134)  3900.124 ≤ Yh(new) ≤ 4653.456
Xh1 = 340,000, Xh2 = 6.9, Xh3 = 0:  4326.65 ± 2.5953(145.930)  3947.918 ≤ Yh(new) ≤ 4705.382
6.14. a. Ŷh = 4278.37, s{predmean} = 85.82262, t(.975; 48) = 2.01063, 4278.37 ± 2.01063(85.82262), 4105.812 ≤ Ȳh(new) ≤ 4450.928
b. 12,317.44 ≤ Total labor hours ≤ 13,352.78
6.15. b.
     Y      X1      X2      X3
Y    1.000  −.7868  −.6029  −.6446
X1          1.000   .5680   .5697
X2                  1.000   .6705
X3                          1.000
c. Ŷ = 158.491 − 1.1416X1 − 0.4420X2 − 13.4702X3
d&e.
i:              1        2        ...  45       46
ei:             .1129    −9.0797  ...  −5.5380  10.0524
Expected Val.:  −0.8186  −8.1772  ...  −5.4314  8.1772
f. No
g. SSR* = 21,355.5, SSE = 4,248.8, X²_BP = (21,355.5/2) ÷ (4,248.8/46)² = 1.2516, χ²(.99; 3) = 11.3449. If X²_BP ≤ 11.3449 conclude error variance constant, otherwise error variance not constant. Conclude error variance constant.
6.16. a. H0: β1 = β2 = β3 = 0, Ha: not all βk = 0 (k = 1, 2, 3). MSR = 3,040.2, MSE = 101.2, F* = 3,040.2/101.2 = 30.05, F(.90; 3, 42) = 2.2191. If F* ≤ 2.2191 conclude H0, otherwise Ha. Conclude Ha. P-value = 0+
b. s{b1} = .2148, s{b2} = .4920, s{b3} = 7.0997, B = t(.9833; 42) = 2.1995
−1.1416 ± 2.1995(.2148)  −1.6141 ≤ β1 ≤ −0.6691
−.4420 ± 2.1995(.4920)  −1.5242 ≤ β2 ≤ 0.6402
−13.4702 ± 2.1995(7.0997)  −29.0860 ≤ β3 ≤ 2.1456
c. SSR = 9,120.46, SSTO = 13,369.3, R = .8260
6.17. a. Ŷh = 69.0103, s{Ŷh} = 2.6646, t(.95; 42) = 1.6820, 69.0103 ± 1.6820(2.6646), 64.5284 ≤ E{Yh} ≤ 73.4922
b. s{pred} = 10.405, 69.0103 ± 1.6820(10.405), 51.5091 ≤ Yh(new) ≤ 86.5115
6.18. b.
     Y       X1      X2      X3      X4
Y    1.0000  −.2503  .4138   .0665   .5353
X1           1.0000  .3888   −.2527  .2886
X2                   1.0000  −.3798  .4407
X3                           1.0000  .0806
X4                                   1.0000
c. Ŷ = 12.2006 − .1420X1 + .2820X2 + 0.6193X3 + 0.0000079X4
d&e.
i:              1        ...
ei:             −1.0357  ...
Expected Val.:  −1.1524  ...
f. No
g. n1 = 40, d̄1 = 0.8696, n2 = 41, d̄2 = 0.7793, s = 0.7357, t*_BF = (0.8696 − 0.7793) ÷ [0.7357 √(1/40 + 1/41)] = 0.5523, t(.975; 79) = 1.9905. If |t*_BF| ≤ 1.9905 conclude error variance constant, otherwise error variance not constant.
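The modified Breusch-Pagan statistic used in these chapters, X²_BP = (SSR*/2) ÷ (SSE/n)², can be checked numerically with the Problem 6.15g figures (Python sketch; the function name is ours, SSR* and SSE are taken from the solution):

```python
# Modified Breusch-Pagan statistic: SSR* is from regressing the squared
# residuals on the predictors, SSE is from the original fit.
def breusch_pagan(ssr_star, sse, n):
    return (ssr_star / 2) / (sse / n) ** 2

x2_bp = breusch_pagan(21355.5, 4248.8, 46)   # about 1.25
constant_variance = x2_bp <= 11.3449         # chi-square(.99; 3)
```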

Conclude error variance constant.
i:              2        ...  80       81
ei:             −1.5138  ...  −2.302   −.9068
Expected Val.:  −1.5857  ...  −1.9321  −1.0407
c. d&e. f. g.
6.19. a. H0: β1 = β2 = β3 = β4 = 0, Ha: not all βk = 0 (k = 1, 2, 3, 4). MSR = 34.5817, MSE = 1.2925, F* = 34.5817/1.2925 = 26.7557, F(.95; 4, 76) = 2.4920. If F* ≤ 2.4920 conclude H0, otherwise Ha. Conclude Ha. P-value = 0+
b. s{b1} = .02134, s{b2} = .06317, s{b3} = 1.08681, s{b4} = .00000138, B = t(.99375; 76) = 2.5585
−.1420 ± 2.5585(.02134)  −.1966 ≤ β1 ≤ −.0874
.2820 ± 2.5585(.06317)  .1204 ≤ β2 ≤ .4436
.6193 ± 2.5585(1.08681)  −2.1613 ≤ β3 ≤ 3.3999
.0000079 ± 2.5585(.00000138)  .0000044 ≤ β4 ≤ .0000114
c. SSR = 138.327, SSTO = 236.5576, R² = .5847
6.20. F(.95; 5, 76) = 2.3349, W = 3.4168; B = t(.99375; 76) = 2.5585; Bonferroni
Xh1 = 5, Xh2 = 8.25, Xh3 = 0, Xh4 = 250,000:  15.7981 ± 2.5585(.2781)  15.087 ≤ E{Yh} ≤ 16.510
Xh1 = 6, Xh2 = 8.50, Xh3 = .23, Xh4 = 270,000:  16.0275 ± 2.5585(.2359)  15.424 ≤ E{Yh} ≤ 16.631
Xh1 = 14, Xh2 = 11.50, Xh3 = .11, Xh4 = 300,000:  15.9007 ± 2.5585(.2222)  15.332 ≤ E{Yh} ≤ 16.469
Xh1 = 12, Xh2 = 10.25, Xh3 = 0, Xh4 = 310,000:  15.8434 ± 2.5585(.2591)  15.180 ≤ E{Yh} ≤ 16.506
6.21. t(.975; 76) = 1.9917
Xh1 = 4, Xh2 = 10.0, Xh3 = 0.10, Xh4 = 80,000:  15.1485 ± 1.9917(1.1528)  12.852 ≤ Yh(new) ≤ 17.445
Xh1 = 6, Xh2 = 11.5, Xh3 = 0, Xh4 = 120,000:  15.5425 ± 1.9917(1.1535)  13.245 ≤ Yh(new) ≤ 17.840
Xh1 = 12, Xh2 = 12.5, Xh3 = .32, Xh4 = 340,000:  16.9138 ± 1.9917(1.1946)  14.535 ≤ Yh(new) ≤ 19.293
85 percent
6.22. a. Yes
b. No; yes, Yi′ = loge Yi = β0 + β1Xi1 + β2Xi2 + εi′, where εi′ = loge εi
c. Yes
d. No, no
e. No; yes, Yi′ = loge(Yi⁻¹ − 1) = β0 + β1Xi1 + εi′
6.23. a. Q = Σ(Yi − β1Xi1 − β2Xi2)²
∂Q/∂β1 = −2Σ(Yi − β1Xi1 − β2Xi2)Xi1
∂Q/∂β2 = −2Σ(Yi − β1Xi1 − β2Xi2)Xi2
Setting the derivatives equal to zero, simplifying, and substituting the least squares estimators b1 and b2 yields:
ΣYiXi1 − b1ΣXi1² − b2ΣXi1Xi2 = 0
ΣYiXi2 − b1ΣXi1Xi2 − b2ΣXi2² = 0
and:
b1 = [ΣYiXi1 ΣXi2² − ΣYiXi2 ΣXi1Xi2] ÷ [ΣXi1² ΣXi2² − (ΣXi1Xi2)²]
b2 = [ΣYiXi2 ΣXi1² − ΣYiXi1 ΣXi1Xi2] ÷ [ΣXi1² ΣXi2² − (ΣXi1Xi2)²]
b. L = Π(i=1..n) (1/√(2πσ²)) exp[−(1/2σ²)(Yi − β1Xi1 − β2Xi2)²]
It is more convenient to work with loge L:
loge L = −(n/2) loge(2πσ²) − (1/2σ²) Σ(Yi − β1Xi1 − β2Xi2)²
∂ loge L/∂β1 = (1/σ²) Σ(Yi − β1Xi1 − β2Xi2)Xi1
∂ loge L/∂β2 = (1/σ²) Σ(Yi − β1Xi1 − β2Xi2)Xi2
Setting the derivatives equal to zero, simplifying, and substituting the maximum likelihood estimators b1 and b2 yields the same normal equations as in part (a), and hence the same estimators.
6.24. a. Q = Σ(Yi − β0 − β1Xi1 − β2Xi1² − β3Xi2)²
∂Q/∂β0 = −2Σ(Yi − β0 − β1Xi1 − β2Xi1² − β3Xi2)
∂Q/∂β1 = −2Σ(Yi − β0 − β1Xi1 − β2Xi1² − β3Xi2)Xi1
∂Q/∂β2 = −2Σ(Yi − β0 − β1Xi1 − β2Xi1² − β3Xi2)Xi1²
∂Q/∂β3 = −2Σ(Yi − β0 − β1Xi1 − β2Xi1² − β3Xi2)Xi2
Setting the derivatives equal to zero, simplifying, and substituting the least squares estimators b0, b1, b2, and b3 yields the normal equations:
ΣYi − nb0 − b1ΣXi1 − b2ΣXi1² − b3ΣXi2 = 0
ΣYiXi1 − b0ΣXi1 − b1ΣXi1² − b2ΣXi1³ − b3ΣXi1Xi2 = 0
ΣYiXi1² − b0ΣXi1² − b1ΣXi1³ − b2ΣXi1⁴ − b3ΣXi1²Xi2 = 0
b.
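The closed-form estimators derived in Problem 6.23a can be exercised directly by solving the two normal equations. A minimal sketch; the toy data are ours, generated exactly as Y = 2X1 + 3X2, so the estimators should recover b1 = 2 and b2 = 3:

```python
# Least squares for Y = beta1*X1 + beta2*X2 (no intercept), solving the
# two normal equations from Problem 6.23a in closed form.
def fit_two_predictors_no_intercept(x1, x2, y):
    s11 = sum(a * a for a in x1)                   # sum Xi1^2
    s22 = sum(a * a for a in x2)                   # sum Xi2^2
    s12 = sum(a * b for a, b in zip(x1, x2))       # sum Xi1*Xi2
    sy1 = sum(a * b for a, b in zip(y, x1))        # sum Yi*Xi1
    sy2 = sum(a * b for a, b in zip(y, x2))        # sum Yi*Xi2
    det = s11 * s22 - s12 * s12
    b1 = (sy1 * s22 - sy2 * s12) / det
    b2 = (sy2 * s11 - sy1 * s12) / det
    return b1, b2

# Toy data generated exactly as Y = 2*X1 + 3*X2:
b1, b2 = fit_two_predictors_no_intercept([1, 2, 3], [1, 0, 2], [5, 4, 12])
```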