MGaluzzi
Jun 12, 2014

Is too much data a problem for big data?

In response to an EFF lawsuit and subsequent requests for specific data, the NSA says it can’t stop deleting the data sought in the court case because its systems are too complex. Clearly the NSA is dealing with far more data than most, if not all, companies, but are the sheer complexity and volume of big data its Achilles’ heel?

jimlynch
06/18/2014
You may find some of these TED Talks interesting:

Playlist: Making sense of too much data
https://www.ted.com/playlists/56/making_sense_of_too_much_data
TravisT
06/12/2014

That’s the challenge of big data: taking a massive amount of data and applying analytics to extract useful, actionable information. Keep in mind, I’m not a data scientist, but analysts sometimes refer to the “Three Vs of Big Data”: data volume, data velocity and data type variety. Volume is obviously one of these and is considered a critical component of data analytics. Of course, the greater the variety and volume, the more challenging the data can be to work with.
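
As a minimal sketch of what handling the volume side can look like in practice: stream records one at a time and aggregate as you go, so memory use stays flat no matter how large the file grows. The file name events.log and its tab-separated layout below are assumptions made up purely for illustration.

```python
# Minimal sketch: single-pass streaming aggregation over a large log file.
# The file is never loaded into memory all at once; each line is read,
# tallied, and discarded. "events.log" and its layout are hypothetical.
from collections import Counter

def count_event_types(path):
    """Tally events by type, assumed to be the first tab-separated field."""
    counts = Counter()
    with open(path) as f:
        for line in f:  # iterating a file object streams it line by line
            fields = line.rstrip("\n").split("\t")
            if fields[0]:  # skip blank lines
                counts[fields[0]] += 1
    return counts

if __name__ == "__main__":
    # Report the five most frequent event types.
    for event_type, n in count_event_types("events.log").most_common(5):
        print(event_type, n)
```

Roughly speaking, this one-pass pattern is what frameworks like MapReduce, Spark and Storm scale out across many machines.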


With respect to the NSA, I’m not buying it. They have a history of not being honest, even when the director is testifying before Congress. They also have a history of denying that things are possible, only to have it come out later that not only was it possible, they were actually doing it. Intercepting Google’s network traffic comes to mind, for example. I firmly suspect they are destroying the requested data because they don’t want to provide it, not because they lack the technological know-how to preserve it. Keep in mind that just a week or two ago, in response to an ACLU records request to a Florida police department about cell phone spying, the U.S. Marshals Service deputized a local police officer as a “special deputy marshal” just before the records were to be produced, then used that as a basis to claim that all the records were federal property and removed them from the jurisdiction.
