In recent years, how we analyze data has changed dramatically. With the advent of personal computers and the internet, the sheer volume of data available to us has grown enormously. Companies have terabytes of data about the consumers they interact with, and governmental, academic, and private research institutions have extensive archival and survey data on all manner of research topics. Gleaning information (let alone wisdom) from these massive stores of data has become an industry in itself. At the same time, presenting that information in easily accessible and digestible ways has become increasingly challenging.
The science of data analysis (statistics, psychometrics, econometrics, and machine learning) has kept pace with this explosion of data. Before personal computers and the internet, academic researchers developed new statistical methods and published them as theoretical papers in professional journals. It could take years for programmers to adapt these methods and incorporate them into the statistical packages widely available to data analysts. Now new methodologies appear daily. Statistical researchers publish new and improved methods, along with the code to implement them, on easily accessible websites.