The Semantic Web As Big Brother Could Have Prevented The BP Oil Disaster
The Semantic Web is coming, and an oil disaster like the one we're experiencing in the Gulf might help move up its timetable. Since Web 3.0's semantic technology is predicated on everything in our physical world being tagged with its own Internet address, once that infrastructure is in place anybody will be able to monitor any type of operation via the Web, even offshore drilling.
W. David Stephenson, principal of Stephenson Strategies (aka The Public Data Guy), asserts that a new Web-based "Regulation 3.0" technology could actually have prevented the BP offshore oil tragedy, and that it needs to be in place before future oil rig failures occur.
Conjuring up images of 'Big Brother,' Stephenson argues that "Regulation 3.0" would give federal and state regulators direct, real-time access to the exact same data that the companies have access to. Taking this premise further, he believes the tactic could easily be "applicable to other regulatory controversies such as oversight of the TARP loans by banks."
In 2007, Stephenson introduced a series of 21st Century disaster tips on using personal communication devices during hurricanes.
The way Regulation 3.0 would work in the wild is based on what's called "structured data," where all 'things' in our real world are tagged with a machine-readable code. Tags expressed in the eXtensible Markup Language (XML) allow continuous access from any place on the Web, which means the data doesn't have to be manually updated or pasted anywhere; it automatically and instantaneously flows wherever those same tags are inserted.
Applying the combination of structured data and the "Internet of Things" to the BP disaster could have allowed monitoring of every part of the rig's safety system in real time. It could have revealed the battery, defective blowout preventer and other problems contributing to the failure. A procedure could have been designed using the system so regulators could have automatically shut down the rig when it failed the pressure test, rather than leaving that decision up to BP.
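To make the idea concrete, here is a minimal sketch of what an automated regulator-side check on a structured-data feed might look like. Everything here is hypothetical: the tag names, the rig feed, and the pressure thresholds are invented for illustration and do not reflect any actual rig-monitoring schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical structured-data feed from a rig's safety systems.
# Tag names and values are invented for illustration only.
RIG_FEED = """
<rig id="example-rig">
  <component name="battery" status="low"/>
  <component name="blowout-preventer" status="defective"/>
  <pressure-test psi="6200" required-psi="6800"/>
</rig>
"""

def evaluate_rig(feed_xml):
    """Return (faulty components, shutdown flag) from a rig's XML feed."""
    rig = ET.fromstring(feed_xml)
    # Any component not reporting "ok" is flagged for regulators.
    faults = [c.get("name") for c in rig.findall("component")
              if c.get("status") != "ok"]
    # A failed pressure test triggers an automatic shutdown decision,
    # taken by the regulator's system rather than the operator.
    test = rig.find("pressure-test")
    shutdown = float(test.get("psi")) < float(test.get("required-psi"))
    return faults, shutdown

faults, shutdown = evaluate_rig(RIG_FEED)
print(faults)    # ['battery', 'blowout-preventer']
print(shutdown)  # True
```

The point of the sketch is the division of labor: because the same tagged data flows to the regulator and the operator simultaneously, the shutdown rule can run on the regulator's side rather than waiting on the company's judgment.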
Stephenson, who is an award-winning environmental crisis manager, believes this Big Brother approach needs to be based on 'transparency' because BP has shown it cannot be trusted to make these types of decisions on its own, particularly when they have far-reaching global ramifications for the ecology and for those who make their living from the sea.
Instead, Stephenson is suggesting a "Don't Trust Us, Track Us" strategy in which regulators will have "unfettered access to the data (that) would help earn (back) the confidence," that BP has lost.
According to a Federal Computer Week release, Stephenson said the foundation for Regulation 3.0 is already in place. The SEC has begun a phased-in program requiring all publicly-traded companies to file their reports using XBRL, a business-oriented XML subset that is currently being extended to include oil-industry-specific tags. Companies already using XBRL for SEC reporting will get more benefit and amortize that expense if they begin to use it for these types of preventative purposes.
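For readers unfamiliar with XBRL, it is simply XML with standardized, taxonomy-defined tags, which is what lets the same value flow to any system that understands the tag. The fragment below is a rough sketch: the namespace URL and the "oil:" tags are invented placeholders, since the real oil-industry extensions mentioned above were still being defined.

```python
import xml.etree.ElementTree as ET

# A made-up, XBRL-style instance fragment. Real filings reference
# official taxonomies (e.g. us-gaap); the "oil" namespace here is
# purely an illustrative placeholder.
INSTANCE = """
<xbrl xmlns:oil="http://example.com/oil-taxonomy">
  <oil:BlowoutPreventerLastInspected contextRef="Q1">2010-01-15</oil:BlowoutPreventerLastInspected>
  <oil:WellPressurePsi contextRef="Q1">6200</oil:WellPressurePsi>
</xbrl>
"""

root = ET.fromstring(INSTANCE)
ns = {"oil": "http://example.com/oil-taxonomy"}

# Because the tag is standardized, a regulator's software can pull the
# same value the company reported, with no manual re-entry.
pressure = root.find("oil:WellPressurePsi", ns)
print(pressure.text)  # "6200"
```

This is the "don't trust us, track us" mechanism in miniature: once a value is tagged, whoever holds the taxonomy can read it directly.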
So, in an age where privacy is under close scrutiny due to Facebook's recent transgressions, Regulation 3.0, while it serves the 'greater good,' might find just as much resistance as Mark Zuckerberg's Open Graph. Or perhaps this approach needs to exist in a world that occasionally needs regulation to protect those who can't protect themselves? Web 3.0 and semantic technology draw closer with each passing day. It will be interesting to see if Stephenson's Regulation 3.0 passes muster so we never have to experience another oil disaster like the one we're living through presently.
Your thoughts? Too much government intervention? Or a necessary evil?