The New Kind of Social Science (NKSS) Project began in 2002, when Valerie Hudson of Brigham Young University asked Philip Schrodt, then of the University of Kansas (now at Penn State), whether he had read Stephen Wolfram's then-new book, A New Kind of Science. The two had known each other for almost two decades, dating back to when both taught international relations at Northwestern University. Though Hudson and Schrodt had been trained in standard quantitative methods of analysis, each had explored methodologies that overcame some of the inherent limitations of the standard approach, such as computational modeling and rule-based production systems.
Hudson and Schrodt were intrigued by Wolfram's contention that statistical and mathematical methods capture only a small and constrained subset of the analytical power possessed by the human mind. Schrodt had written a book manuscript on pattern recognition in the social sciences that emphasized this very point. The analysis of complex systems has long stymied social science methodology, and here Wolfram (who was not himself interested in social science) made a second relevant claim: most complexity in nature is actually the product of small sets of rules operating on an initial set of conditions. Wolfram was able to generate very complex patterns by iterating sets of only two, three, or four simple rules.
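Wolfram's elementary cellular automata make this point concrete. The following minimal Python sketch (ours, for illustration; it is not part of the NKSS materials) iterates rule 110, a single eight-entry lookup table, from one live cell, and prints the strikingly complex pattern that results:

    # An elementary cellular automaton applies one tiny lookup rule to every
    # cell, yet rules such as rule 110 produce highly complex patterns.

    def step(cells, rule_number):
        """Apply an elementary CA rule to a row of 0/1 cells (wrap-around edges)."""
        n = len(cells)
        return [(rule_number >> ((cells[(i - 1) % n] << 2)
                                 | (cells[i] << 1)
                                 | cells[(i + 1) % n])) & 1
                for i in range(n)]

    width, steps = 64, 32
    row = [0] * width
    row[width // 2] = 1              # a single live cell as the initial condition
    for _ in range(steps):
        print("".join("#" if c else "." for c in row))
        row = step(row, 110)         # rule 110 is even capable of universal computation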
If the human mind evolved to identify patterns, and to find meaning in patterns by imputing the rules that created them, then this applies not only to human understanding of the natural world, but also to human behavior itself. Humans use rules to create meaningful patterns of behavior and communication in the social world, and they look for patterns in the behavior and communication of other humans to which they can then impute rules and thus meaning.
Given that the social world contains a dizzying variety of noisy time streams of behavior and communication, standard statistical and mathematical techniques cannot be expected to match the powers of pattern recognition and rule imputation that the human mind evolved to possess. Might it be possible to tap into those evolved capabilities, and to refine and facilitate them, in order to analyze complex time streams in a replicable and falsifiable manner? And, per Wolfram, might the observed complexity be the product of a relatively small set of rules? Reversing Wolfram's approach, can one examine complexity and impute the set of rules that created it?
NKSS believes all three questions can be answered in the affirmative.
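To make the "reversal" concrete, here is a hedged toy sketch (again ours, not the NKSS tool): given successive rows of a pattern generated by an unknown elementary cellular automaton, test all 256 candidate rules and keep those consistent with every observed transition, thereby imputing the rule from the complexity it produced.

    def step(cells, rule_number):
        n = len(cells)
        return [(rule_number >> ((cells[(i - 1) % n] << 2)
                                 | (cells[i] << 1)
                                 | cells[(i + 1) % n])) & 1
                for i in range(n)]

    def impute_rules(rows):
        """Return every elementary CA rule consistent with the observed rows."""
        return [r for r in range(256)
                if all(step(rows[t], r) == rows[t + 1] for t in range(len(rows) - 1))]

    # Generate a pattern with a "hidden" rule, then recover it from the data alone.
    row = [0] * 32
    row[16] = 1
    history = [row]
    for _ in range(10):
        history.append(step(history[-1], 110))
    print(impute_rules(history))   # rule 110 is among the (few) consistent rules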
To that end, with the help of programmer Ray D. Whitmer, we have constructed a web-based tool to facilitate pattern recognition and rule imputation in complex time-stream data. We have then used that tool to study real-world behavior among nation-states, using what we have termed "discrete sequence rule (DSR) modeling."
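The NKSS tool itself is not reproduced here, but the following toy sketch conveys the flavor of matching a discrete sequence rule against a stream of event codes; the codes, the rule, and the gap parameter are all invented for illustration and do not come from the tool.

    # A "rule" here is a short ordered template of event codes; we report
    # where it occurs in the stream, allowing a few intervening events.

    def rule_matches(stream, rule, max_gap=2):
        """Yield start indices where `rule` occurs in order, with at most
        max_gap intervening events between consecutive rule elements."""
        for start, event in enumerate(stream):
            if event != rule[0]:
                continue
            i, last = 1, start
            for j in range(start + 1, len(stream)):
                if i == len(rule):
                    break
                if j - last - 1 > max_gap:     # gap constraint violated
                    break
                if stream[j] == rule[i]:
                    last, i = j, i + 1
            if i == len(rule):
                yield start

    # Invented codes: P = protest, R = repression, E = escalation, T = talks
    stream = list("PTRPREPTTRE")
    escalation_rule = list("PRE")   # hypothesis: protest -> repression -> escalation
    print(list(rule_matches(stream, escalation_rule)))   # -> [0, 3, 6]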
A short animated presentation of the overall idea can be found here.