How big is Big Data? We currently produce 2.5 exabytes (2.5 billion gigabytes) of new information every day, and that figure is projected to reach 44 zettabytes (44 billion terabytes) within the next three to four years. That is a lot of information, and it is growing ever faster. "Data is the oil of the 21st century", some say. Large corporate information silos accumulate data the way oil companies extract fossil fuels. Unlike with oil, however, there won't be a "peak data" moment.
Much of this data is personal. People share it in exchange for "free" services such as social networks, search engines, video platforms and email providers. Trust in large corporations is expressed mainly by consenting to terms of service that are rarely even read, much less understood.
It's a hacker's paradise: large amounts of data on centralized servers; outdated software that can be easily exploited; ever more devices and infrastructure connected to the Internet, with "always on" as a fatal paradigm. Software is cracked, systems are hacked, data is leaking: not just to Wikileaks, but to a myriad of players, from small hacker groups to large intelligence services. No system connected to the Internet is safe, not even governments, critical infrastructure or the companies servicing the surveillance-industrial complex.
What's next? Is it possible to escape this enormous accumulation of personal data? Is encryption on a large scale a way to protect the human right to privacy? Or should we disconnect from the status quo, literally?
Sunday, 05 Mar
19:30 - 21:00
Forum Stadtpark
Moderation: Daniel Erlacher (Elevate)
#e17hacks