Kalev will be speaking at the 2014 Wolfram Data Summit, presenting "What It Takes to Compute on the Whole World." The Wolfram Data Summit is:
A high-level gathering of innovators in data science, creators of connected devices, and leaders of major data repositories. Established as a forum for leaders of the world's great data repositories, the Wolfram Data Summit has become an annual event for those interested in the latest innovations in data and data science. The Summit delivers big ideas, challenging debates, and unparalleled opportunities to meet and exchange insights with notable participants representing a broad spectrum of interests and industries.
Kalev's talk will present a cross-section of his experiences and lessons learned from leading some of the world's largest societal-scale "big data" projects:
What does it take to build a system that monitors the entire world, constructing a real-time global catalog of behavior and beliefs across every country, connecting every person, organization, location, count, theme, news source, and event across the planet into a single massive, ever-evolving, real-time network capturing what's happening around the world, what its context is, who's involved, and how the world is feeling about it, every single day? What does it look like to construct a semantic network over nearly the entirety of the world's socio-cultural academic literature about Africa and the Middle East dating back half a century? Or to construct the same network over the entire web itself, stretching back two decades? How do you visualize networks with hundreds of millions of nodes, tease structure from chaotic real-world observations, or explore networks in the multi-petabyte range? How do you process and geographically visualize the emotion of Twitter in real time? How do you rethink sentiment mining from scratch to power a flagship new reality television show? How do you adapt systems to cope with machine translation, OCR, closed-captioning errors, the digital divide, and the messiness of real-world global data? How do you process half a million hours of television news, two billion pages of historic books, or the images of half a billion pages stretching back half a millennium? Most intriguingly, how can the world's largest computing platforms allow us to uncover the fundamental mathematical patterns of global human life? This talk will survey a cross-section of my latest projects of the past year, offering glimpses into some of the greatest challenges and opportunities of the big data revolution and how it is reshaping the way we understand the world around us.