Take cancer. Curing it is hard because cancer is not one disease, but many. Tumors can be triggered by a dizzying array of causes, and they mutate as they metastasize. The surest way to kill a tumor is to sequence its genome, figure out which drugs will work against it (without harming you, given your genome and medical history), and perhaps even design a new drug specifically for your case. No doctor can master all the knowledge required for this. Sounds like a perfect job for machine learning: in effect, it's a more complicated and challenging version of the searches that Amazon and Netflix do every day, except it's looking for the right treatment for you instead of the right book or movie. Unfortunately, while today's learning algorithms can diagnose many diseases with superhuman accuracy, curing cancer is well beyond their ken. If we succeed in our quest for the Master Algorithm, it will no longer be.

The second thing is that machine learning is a sword with which to slay the complexity monster. Given enough data, a learning program that's only a few hundred lines long can easily generate a program with millions of lines, and it can do this again and again for different problems. The reduction in complexity for the programmer is phenomenal. Of course, like the Hydra, the complexity monster sprouts new heads as soon as we cut off the old ones, but they start off smaller and take a while to grow, so we still get a big leg up.

Darwin's algorithm

Koza's confidence stands out even in a field not known for its shrinking violets. He sees genetic programming as an invention machine, a silicon Edison for the twenty-first century. He and other evolutionaries believe it can learn any program, making it their entry in the Master Algorithm sweepstakes.
In 2004, they instituted the annual Humie Awards to recognize "human-competitive" genetic creations; thirty-nine have been awarded to date.

You can probably tell just by looking at this plot that the main street in Palo Alto runs southwest-northeast. You didn't draw a street, but you can intuit that it's there from the fact that all the points fall along a straight line (or close to it; they can be on different sides of the street). Indeed, the street is University Avenue, and if you want to shop or eat out in Palo Alto, that's the place to go. As a bonus, once you know that the shops are on University Avenue, you don't need two numbers to locate them, just one: the street number (or, if you wanted to be really precise, the distance from the shop to the Caltrain station, on the southwest corner, which is where University Avenue begins).

Recall that a Markov network is defined by a weighted sum of features, much like a perceptron. Suppose we have a collection of photos of people. We pick a random one and compute features of it like "The person has gray hair," "The person is old," "The person is a woman," and so on. In a perceptron, we pass the weighted sum of these features through a threshold to decide whether, say, the person is your grandmother or not. In a Markov network, we do something very different (at least at first sight): we exponentiate the weighted sum, turning it into a product of factors, and this product is the probability of choosing that particular picture from the collection, regardless of whether your grandmother is in it. If you have many pictures of old people, the weight of that feature goes up. If most of them are of men, the weight of "The person is a woman" goes down. The features can be anything we want, making Markov networks a remarkably flexible way to represent probability distributions.

Evolution, part 2
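Before moving on: the contrast just drawn between a perceptron and a Markov network can be made concrete. The following is a minimal sketch (the photo collection, feature names, weights, and threshold are all invented for illustration); the perceptron thresholds the weighted sum, while the Markov network exponentiates it and normalizes over the collection to get a probability for each photo.

```python
import math

# A toy "collection of photos," each described by binary features.
# The features and weights below are made up for illustration.
photos = [
    {"gray_hair": 1, "old": 1, "woman": 1},  # perhaps your grandmother
    {"gray_hair": 1, "old": 1, "woman": 0},
    {"gray_hair": 0, "old": 0, "woman": 1},
]
weights = {"gray_hair": 1.2, "old": 0.8, "woman": 0.5}

def score(photo):
    """Weighted sum of features, common to both models."""
    return sum(weights[f] * v for f, v in photo.items())

def perceptron(photo, threshold=2.0):
    """A perceptron passes the weighted sum through a threshold."""
    return score(photo) > threshold

# A Markov network instead exponentiates the weighted sum, turning it
# into a product of factors; dividing by the sum over the collection
# (the partition function Z) gives the probability of each photo.
unnormalized = [math.exp(score(p)) for p in photos]
Z = sum(unnormalized)
probabilities = [u / Z for u in unnormalized]

print(perceptron(photos[0]))  # score 2.5 exceeds the threshold
print(probabilities)          # highest for the gray-haired old woman
```

Note that raising the weight of a feature, say "old," multiplies the factor for every photo exhibiting it, shifting probability mass toward those photos without changing the thresholded decisions in any fixed way; that is the sense in which the two models use the same weighted sum very differently.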
Many people worry that human-directed evolution will permanently split the human race into a class of genetic haves and one of have-nots. This strikes me as a singular failure of imagination. Natural evolution did not result in just two species, one subservient to the other, but in an infinite variety of creatures and intricate ecosystems. Why would artificial evolution, building on it but less constrained, do so?

Thanks for letting me be your guide. I'd like to give you a parting gift. Newton said that he felt like a boy playing on the seashore, picking up a pebble here and a shell there while the great ocean of truth lay undiscovered before him. Three hundred years later, we've gathered an amazing collection of pebbles and shells, but the great undiscovered ocean still stretches into the distance, sparkling with promise. The gift is a boat, machine learning, and it's time to set sail.

Prologue

Introduction to Statistical Relational Learning,* edited by Lise Getoor and Ben Taskar (MIT Press, 2007), surveys the main approaches in this area. My work with Matt Richardson on modeling word of mouth is summarized in "Mining social networks for viral marketing" (IEEE Intelligent Systems, 2005).