
Introduction to Econometrics (PDF eBook) 4th Edition



eBook by Maddala, G. S./Lahiri, Kajal


£47.95

ISBN: 9781119958994
Publication Date: 23 Sep 2014
Edition: 4th Edition
Publisher: Wiley
Pages: 656 pages
Format: eBook
For delivery: Download available

Description

The landmark text, Introduction to Econometrics, now fully revised and updated in its new Fourth Edition, offers a fresh, accessible, and well-written introduction to the subject of econometrics, which literally means "measurement in economics". With a rigorous pedagogical framework, which sets it apart from comparable texts, the book also includes Web-based supplements such as an Instructor's Manual and data sets.

Contents

Foreword xvii
Preface to the Fourth Edition xix

Part I Introduction and the Linear Regression Model 1

CHAPTER 1 What is Econometrics? 3
1.1 What is econometrics? 3
1.2 Economic and econometric models 4
1.3 The aims and methodology of econometrics 6
1.4 What constitutes a test of an economic theory? 8

CHAPTER 2 Statistical Background and Matrix Algebra 11
2.1 Introduction 11
2.2 Probability 12
2.3 Random variables and probability distributions 17
2.4 The normal probability distribution and related distributions 18
2.5 Classical statistical inference 21
2.6 Properties of estimators 22
2.7 Sampling distributions for samples from a normal population 26
2.8 Interval estimation 26
2.9 Testing of hypotheses 28
2.10 Relationship between confidence interval procedures and tests of hypotheses 31
2.11 Combining independent tests 32

CHAPTER 3 Simple Regression 59
3.1 Introduction 59
3.2 Specification of the relationships 61
3.3 The method of moments 65
3.4 The method of least squares 68
3.5 Statistical inference in the linear regression model 76
3.6 Analysis of variance for the simple regression model 83
3.7 Prediction with the simple regression model 85
3.8 Outliers 88
3.9 Alternative functional forms for regression equations 95
*3.10 Inverse prediction in the least squares regression model 99
*3.11 Stochastic regressors 102
*3.12 The regression fallacy 102

CHAPTER 4 Multiple Regression 127
4.1 Introduction 127
4.2 A model with two explanatory variables 129
4.3 Statistical inference in the multiple regression model 134
4.4 Interpretation of the regression coefficients 143
4.5 Partial correlations and multiple correlation 146
4.6 Relationships among simple, partial, and multiple correlation coefficients 147
4.7 Prediction in the multiple regression model 153
4.8 Analysis of variance and tests of hypotheses 155
4.9 Omission of relevant variables and inclusion of irrelevant variables 160
4.10 Degrees of freedom and R² 165
4.11 Tests for stability 169
4.12 The LR, W, and LM tests 176

Part II Violation of the Assumptions of the Basic Regression Model 209

CHAPTER 5 Heteroskedasticity 211
5.1 Introduction 211
5.2 Detection of heteroskedasticity 214
5.3 Consequences of heteroskedasticity 219
5.4 Solutions to the heteroskedasticity problem 221
5.5 Heteroskedasticity and the use of deflators 224
5.6 Testing the linear versus log-linear functional form 228

CHAPTER 6 Autocorrelation 239
6.1 Introduction 239
6.2 The Durbin-Watson test 240
6.3 Estimation in levels versus first differences 242
6.4 Estimation procedures with autocorrelated errors 246
6.5 Effect of AR(1) errors on OLS estimates 250
6.6 Some further comments on the DW test 254
6.7 Tests for serial correlation in models with lagged dependent variables 257
6.8 A general test for higher-order serial correlation: The LM test 259
6.9 Strategies when the DW test statistic is significant 261
*6.10 Trends and random walks 266
*6.11 ARCH models and serial correlation 271
6.12 Some comments on the DW test and Durbin's h-test and t-test 272

CHAPTER 7 Multicollinearity 279
7.1 Introduction 279
7.2 Some illustrative examples 280
7.3 Some measures of multicollinearity 283
7.4 Problems with measuring multicollinearity 286
7.5 Solutions to the multicollinearity problem: Ridge regression 290
7.6 Principal component regression 292
7.7 Dropping variables 297
7.8 Miscellaneous other solutions 300

CHAPTER 8 Dummy Variables and Truncated Variables 313
8.1 Introduction 313
8.2 Dummy variables for changes in the intercept term 314
8.3 Dummy variables for changes in slope coefficients 319
8.4 Dummy variables for cross-equation constraints 322
8.5 Dummy variables for testing stability of regression coefficients 324
8.6 Dummy variables under heteroskedasticity and autocorrelation 327
8.7 Dummy dependent variables 329
8.8 The linear probability model and the linear discriminant function 329
8.9 The probit and logit models 333
8.10 Truncated variables: The tobit model 343

CHAPTER 9 Simultaneous Equation Models 355
9.1 Introduction 355
9.2 Endogenous and exogenous variables 357
9.3 The identification problem: Identification through reduced form 357
9.4 Necessary and sufficient conditions for identification 362
9.5 Methods of estimation: The instrumental variable method 365
9.6 Methods of estimation: The two-stage least squares method 371
9.7 The question of normalization 378
*9.8 The limited-information maximum likelihood method 379
*9.9 On the use of OLS in the estimation of simultaneous equation models 380
*9.10 Exogeneity and causality 386
9.11 Some problems with instrumental variable methods 392

CHAPTER 10 Diagnostic Checking, Model Selection, and Specification Testing 401
10.1 Introduction 401
10.2 Diagnostic tests based on least squares residuals 402
10.3 Problems with least squares residuals 404
10.4 Some other types of residual 405
10.5 DFFITS and bounded influence estimation 411
10.6 Model selection 414
10.7 Selection of regressors 419
10.8 Implied F-ratios for the various criteria 423
10.9 Cross-validation 427
10.10 Hausman's specification error test 428
10.11 The Plosser-Schwert-White differencing test 435
10.12 Tests for nonnested hypotheses 436
10.13 Nonnormality of errors 440
10.14 Data transformations 441

CHAPTER 11 Errors in Variables 451
11.1 Introduction 451
11.2 The classical solution for a single-equation model with one explanatory variable 452
11.3 The single-equation model with two explanatory variables 455
11.4 Reverse regression 463
11.5 Instrumental variable methods 465
11.6 Proxy variables 468
11.7 Some other problems 471

Part III Special Topics 479

CHAPTER 12 Introduction to Time-Series Analysis 481
12.1 Introduction 481
12.2 Two methods of time-series analysis: Frequency domain and time domain 482
12.3 Stationary and nonstationary time series 482
12.4 Some useful models for time series 485
12.5 Estimation of AR, MA, and ARMA models 492
12.6 The Box-Jenkins approach 496
12.7 R² measures in time-series models 503

CHAPTER 13 Models of Expectations and Distributed Lags 509
13.1 Models of expectations 509
13.2 Naive models of expectations 510
13.3 The adaptive expectations model 512
13.4 Estimation with the adaptive expectations model 514
13.5 Two illustrative examples 516
13.6 Expectational variables and adjustment lags 520
13.7 Partial adjustment with adaptive expectations 524
13.8 Alternative distributed lag models: Polynomial lags 526
13.9 Rational lags 533
13.10 Rational expectations 534
13.11 Tests for rationality 536
13.12 Estimation of a demand and supply model under rational expectations 538
13.13 The serial correlation problem in rational expectations models 544

CHAPTER 14 Vector Autoregressions, Unit Roots, and Cointegration 551
14.1 Introduction 551
14.2 Vector autoregressions 551
14.3 Problems with VAR models in practice 553
14.4 Unit roots 554
14.5 Unit root tests 555
14.6 Cointegration 563
14.7 The cointegrating regression 564
14.8 Vector autoregressions and cointegration 567
14.9 Cointegration and error correction models 571
14.10 Tests for cointegration 571
14.11 Cointegration and testing of the REH and MEH 572
14.12 A summary assessment of cointegration 574

CHAPTER 15 Panel Data Analysis 583
15.1 Introduction 583
15.2 The LSDV or fixed effects model 584
15.3 The random effects model 586
15.4 Fixed effects versus random effects 589
15.5 Dynamic panel data models 591
15.6 Panel data models with correlated effects and simultaneity 593
15.7 Errors in variables in panel data 595
15.8 The SUR model 597
15.9 The random coefficient model 597

CHAPTER 16 Small-Sample Inference: Resampling Methods 601
16.1 Introduction 601
16.2 Monte Carlo methods 602
16.3 Resampling methods: Jackknife and bootstrap 603
16.4 Bootstrap confidence intervals 605
16.5 Hypothesis testing with the bootstrap 606
16.6 Bootstrapping residuals versus bootstrapping the data 607
16.7 Non-IID errors and nonstationary models 607

Appendix 611
Index 621

Accessing your eBook through Kortext

Once purchased, you can view your eBook through the Kortext app, available to download for Windows, Android and iOS devices. Once you have downloaded the app, your eBook will be available on your Kortext digital bookshelf and can even be downloaded to view offline anytime, anywhere, helping you learn without limits.

In addition, you'll have access to Kortext's smart study tools including highlighting, notetaking, copy and paste, and easy reference export.

To download the Kortext app, head to your device's app store or visit https://app.kortext.com to sign up and read through your browser.

This is a Kortext title - click here to find out more

NB: eBook is only available for a single-user licence (i.e. not for multiple / networked users).

