The National Institute of Standards and Technology (NIST) is working to create standards for Big Data. It has just released the NIST Big Data Interoperability Framework, a large set of documents aimed at standardizing everything in big data, from definitions to architectures.
Big Data Definitions
In case you are wondering (and I know you are), here are the definitions. The framework includes many more.
Big Data consists of extensive datasets – primarily in the characteristics of volume, variety, velocity, and/or variability – that require a scalable architecture for efficient storage, manipulation, and analysis.
Data science is the empirical synthesis of actionable knowledge from raw data through the complete data lifecycle process.
Don’t like the definitions? Great, NIST would love to hear your opinions and comments. Comments are being collected until May 21, 2015.
The NIST Big Data Interoperability Framework is a massive work consisting of seven volumes, and all are open for comments:
- Definitions
- Taxonomies
- Use Cases and Requirements
- Security and Privacy
- Architectures White Paper Survey
- Reference Architecture
- Standards Roadmap
The process for submitting a comment appears rather old-school (hint: NIST, GitHub might be a good place to collect comments and edits), but it is not difficult.