Birth of the Internet and Big Data: Useful Side-Effects of CERN Scientists Going Big (Video)
Scientists at CERN in Switzerland, trying to decipher the fundamental laws and building blocks of our universe, had to go big: achieving their ambitious goals required giant devices and hyper-charged technology. They succeeded. Fifty years on, some of the largest and most sophisticated machines ever built have yielded experimental proof of the complex theories devised by physicists (antimatter and, most recently, the Higgs boson among them), along with scientific honors such as Nobel Prizes.
The technological tools required and developed along the way also include Big Data tools (supercomputers and software) and the internet itself: the first transatlantic data connection and the world's first web server and browser were in Geneva, enabling the generation and exchange of research results.
This video, created for the TEDxCERN event earlier this year, shows the solutions that CERN has developed to handle big data — both today and over previous decades.
"As the rallying call of an era, 'big data' encompasses a multitude of diverse domains, but all share the potential to make a huge impact on society," says Tim Smith, who wrote and narrated the lesson. "Whether the focus is on collecting, preserving, sharing, or combining, these domains all ultimately aim to unlock big data's power through data mining and analysis. Through this, they are striving to put useful information at the fingertips of citizens, services, and businesses the world over."
On 28-29 September, members of the public will have the chance to visit CERN in Geneva, Switzerland. There are even opportunities for visitors to venture underground to see the Large Hadron Collider (LHC) and the enormous particle detectors used by CERN's various physics experiments.
However, there are also plenty of exciting things to see above ground, including the CERN Data Centre, which is at the heart of the Worldwide LHC Computing Grid, used by researchers across the globe to analyze the tens of petabytes of data generated by the organization's experiments each year.