Computer Scientists Discover A Simple Solution That Dramatically Improves The Efficiency Of Computing Large Amounts Of Data

This is related to my own work. Note that incrementally increasing how many times you roll the dice reflects both a parabolic curve and a bell curve: a compromise between random and ordered, and the larger the numbers get, the more dead-on the algorithm becomes.

Note that another research effort found a way to retrodict causality from any coherent data. It's one thing to say something exploded, and another to figure out why it exploded, but AI now has a way to do that for anything.

Currently, the development of AI is on par with that of toddlers raised by psychopathic aliens, and it's becoming more obvious that academia has no clue how to raise children.
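The original article isn't linked here, so this is an assumption: the "roll the dice more times" remark sounds like the power-of-d-choices balanced-allocation scheme, where instead of placing each item in one random bin, you sample d random bins and pick the least loaded. A minimal sketch of that idea (the function name `allocate` and all parameter values are my own illustration, not from the article):

```python
import random

def allocate(n_balls, n_bins, d):
    """Place each ball into the least-loaded of d randomly sampled bins
    (the 'power of d choices' scheme). d=1 is pure random placement."""
    loads = [0] * n_bins
    for _ in range(n_balls):
        # "roll the dice" d times: sample d distinct candidate bins
        candidates = random.sample(range(n_bins), d)
        # place the ball in the emptiest of the candidates
        best = min(candidates, key=lambda b: loads[b])
        loads[best] += 1
    return loads

random.seed(0)
for d in (1, 2, 3):
    loads = allocate(10_000, 1_000, d)
    print(f"d={d}: max bin load = {max(loads)}")
```

The classical result is that going from d=1 to d=2 drops the expected maximum overload from roughly logarithmic to roughly log-logarithmic in the number of bins, which matches the comment's observation that the algorithm gets more "dead on" as the numbers grow.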