Just attended the “Governing Algorithms” conference at NYU, and my mind is buzzing with ideas. I may add a recap of the speakers to this post later, but right now, I just want to get an idea out that was suggested by Paul Dourish’s presentation, in which he suggested we think about “ecosystems of algorithms.”
How would we map such an ecosystem? Algorithms are usually studied either individually (e.g., the algo that determines whether or not you trade a particular stock) or vertically, in combination with the programmer, data, software, hardware, network, and final purpose to which it is put. What would it mean to study these algos as they interact with each other and with data?
For example, the AP Twitter Hack wreaked havoc with the stock market because of interacting algos: the algo that erroneously authenticated the Twitter account, the algos that monitored the AP feed for alarming keywords, and the algos that run the high-frequency trades. (And not for nothing, but the more I learn about HFT, the more I think Frank Herbert was prescient when he wrote “The Tactful Saboteur.”)
An algo that runs on a really huge dynamic data set will not only find new (previously unknowable) patterns but may also produce data itself – data on which other algos will then run. Methodologically, should we try to map these as more-or-less horizontal two-mode networks? And what are the theoretical implications of this (especially for security)?
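One way to make the two-mode idea concrete, purely as an illustration: treat algorithms and data sets as the two node types, with “reads” and “writes” edges between them, and then project onto a one-mode algorithm-to-algorithm influence network (A influences B if A writes data that B reads). The algo and data-set names below are made-up stand-ins loosely based on the AP hack example, not a real data model.

```python
# A minimal sketch of an "ecosystem of algorithms" as a two-mode network.
# Node names are hypothetical, for illustration only.

# Edges in the two-mode (algorithm x data set) graph.
# An algo that "writes" a data set produces data on which other algos run.
edges = [
    ("twitter_auth",    "ap_twitter_feed", "writes"),
    ("keyword_monitor", "ap_twitter_feed", "reads"),
    ("keyword_monitor", "alert_stream",    "writes"),
    ("hft_trader",      "alert_stream",    "reads"),
    ("hft_trader",      "market_prices",   "writes"),
]

def project_one_mode(edges):
    """Project the two-mode network onto algorithms:
    algo A influences algo B if A writes a data set that B reads."""
    writers, readers = {}, {}
    for algo, dataset, direction in edges:
        bucket = writers if direction == "writes" else readers
        bucket.setdefault(dataset, set()).add(algo)
    influence = set()
    for dataset, ws in writers.items():
        for w in ws:
            for r in readers.get(dataset, set()):
                if r != w:
                    influence.add((w, r))
    return influence

print(sorted(project_one_mode(edges)))
# The projection recovers the cascade: the authentication failure feeds
# the keyword monitor, which in turn feeds the high-frequency traders.
```

Even this toy projection shows why the horizontal view matters: no single vertical slice (one algo plus its programmer, data, and purpose) contains the twitter_auth → keyword_monitor → hft_trader chain; it only appears when the data sets are treated as first-class nodes linking the algos.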
UPDATE: and what happens when there is an “internet of things”?
May 22nd, 2013 5:20pm · networks, ecosystems, algorithms