Personal Data Control to Stop the Creepy Factor

(This is a repost, first posted at the Kantara Initiative.)

Regarding the My Data vs Open Data topic, a constructive place to start may be the premise that people need to have control over their personal information. For Open Data not to be creepy, its use with personal information needs to be based on personal preference and control. At the moment we have very little control over personal data (even though we have the right to control our own data), and this is a tremendously big and important issue at this time!

Without this personal data control, Open Data, IOT and emerging sensor-driven Smart Spaces start to get very creepy!

There are a lot of things that prevent personal information control at the moment. A major challenge is that privacy policies and terms of service are closed: each company has custom policies and private infrastructure for the management of personal information (and of our own personal preferences). This needs to change!

In addition, companies lock up personal data so that it is difficult to access and virtually unusable in context, preventing people from asserting preferences, making use of open data or using new IOT devices.

Most significantly, privacy laws were made when there was no digital infrastructure for people to control their own information. This means the laws are focused on “old school” concepts of Data Protection, in which institutions control your data, instead of the other way around: data protection for self-control and use of My Data. (Privacy-by-Design data protection, not the disempowering data protection law that is in place today.)

We are now on the precipice of Smart Spaces (spaces with sensors that are aware of people). Sensors are being built into everything, our things are being hooked up to the internet, and the need for personal information control to protect personal freedom is very apparent.

Smart Spaces are a great case study for discussion, as they illustrate the convergence of IOT, Open Data, and Personal Information Control (PIC) in practical ways, both online and off. Smart Spaces will thrive on value for people, control of personal information, and notice.

With Android and iOS 7 building in support for Bluetooth Smart sensor discovery, the reality of ubiquitous sensors to make Smart Spaces usable by people is here, and they will start to be everywhere in 2014.

We, in the Open Notice community and the Kantara Information Sharing Work Group, are creating a specification for an Open Notice Consent Tag, with the aim of making it into a standard for the systematic discovery of policies and terms, on and offline. The aim is that Open Notice will open up the controls for personal data, and with it we can all start making Smart Notices for Smart Spaces.
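As a rough sketch of what “systematic discovery of policies” could look like in practice, here is a hedged example. It assumes, hypothetically, that a site advertises its policy with a `<link rel="privacy-policy">` tag; the actual Open Notice discovery mechanism is still being specified, so this is an illustration rather than the spec.

```python
from html.parser import HTMLParser

# Illustrative sketch only: the discovery convention below (a link tag with
# rel="privacy-policy") is an assumption, not the Open Notice standard.
class PolicyLinkFinder(HTMLParser):
    """Collect policy URLs advertised in a page's <link> tags."""

    def __init__(self):
        super().__init__()
        self.policies = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if a.get("rel") == "privacy-policy" and "href" in a:
                self.policies.append(a["href"])

page = (
    '<html><head>'
    '<link rel="privacy-policy" href="https://example.com/privacy">'
    '</head></html>'
)
finder = PolicyLinkFinder()
finder.feed(page)
print(finder.policies)  # → ['https://example.com/privacy']
```

A user agent that could run a scan like this against any site (or any Smart Space beacon) would have a machine-readable starting point for presenting, and eventually negotiating, the terms on offer.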

In fact, another topic of great importance is interoperability. Open Notice will standardize the discovery of Open Data and IOT in Smart Spaces so that personal data, and its control, is held by each individual. (To stop the creepy factor.)

Open Notice, and the discovery of Open Data big and small, is something that should be championed and sponsored by the Open Knowledge Foundation, Open IOT, Open Rights, and Open Notice communities, as it will take a community to converge on and make a standard we can all use.

Why I wouldn’t define myself as a privacy expert

I recently lamented the fact that I don’t think of myself as a privacy expert. I know a heck of a lot about privacy: I studied it at university, going so far as to study the anthropology of CCTV signs in London, UK, and I became CIPP certified as a privacy professional for information technology. Yet the thought of being called a privacy expert today is becoming more and more unattractive.

The reason why: the term privacy itself no longer has the same meaning; in fact it is starting to mean the opposite of what it meant when I was first inspired to study privacy.

Socially, privacy was used as an authoritative term to describe a physical state a person enjoys, a social expression of the need for this ‘private’ space in various aspects of life.

This led to the development of privacy principles and eventually to the enactment of privacy laws. Today, these laws are not being enforced. The term privacy has come to describe data protection, where even my own data is protected from me.

Most privacy professionals work with enterprises to manage and control personal information in ways that respect the laws and enhance operations.

What we currently know as privacy law is framed in terms of the archaic mainframe/client database-orientated thinking that inspired how the laws were originally enacted.

Today, modern privacy is more about personal information control, not Data Protection, which is currently doing the opposite of what it is supposed to. When someone says “privacy” there is no distinction between these two meanings.

The term privacy is further confused by the commercial use of privacy policies, which does not mean that people have privacy protections. Today, a privacy policy too often explains which privacy protections people do not have.

In today’s world we need to control our own information, and data protection should refer to the protections individuals personally have to control information. It is against privacy principles (around the world) to take more information than is required from people, but this principle is not applied administratively or enforced by regulators.

We need personal information controls, based on the principles of privacy entrenched in law, that enable people to manage their own personal information.

The system of passwords and profiles is a broken infrastructure that needs to be advanced. Profiles and the need for passwords hinder data portability and provide a loophole for the unnecessary collection of personal information.

How did all of this happen?
Short answer: Closed Privacy Policies and Terms of Service

The profiles we make for each service provided to us on the internet are used in conjunction with unusable, closed privacy policies and terms of service, which contravene privacy principles (in every jurisdiction) by limiting our access to, and control over, personal information.

What is worse, privacy as an authoritative claim to basic human respect is being turned into somewhat of a negative term, especially in light of the privacy industry that has formed to support enterprise: an industry that is now focused on supporting and entrenching data protection and commercial control of personal information rather than basic respect for human space.

In effect, the opposite of what privacy as an authoritative term once meant, and the opposite of what I want to be associated with as an expert.

Hundreds of US companies make false data protection claims

Safe Harbour is unsafe; we need something a bit more trustworthy. From:

STRASBOURG – Hundreds of US-based companies handling EU citizens’ data have lied about belonging to a data protection arrangement known as the Safe Harbour Framework.

Christopher Connolly, a director at Galexia, an Australian-based consulting company on internet law and privacy, told the European Parliament’s civil liberties committee on Monday (7 October) that “many claims of Safe Harbour membership are false.”

He said around one out of every seven Safe Harbour claims of membership are bogus.

The Safe Harbour agreement, hammered out in 2000 between the European Commission and the US Department of Commerce, is supposed to ensure that firms follow EU data protection laws when processing the personal data of EU citizens.

Just under 3,000 companies have signed up to the self-certification scheme, which is only enforceable once the company makes a promise to adhere to a handful of privacy principles.

Companies are also entitled to limit the scope to cover only human resource data, or consumer data, or just offline data.

Galexia research found over 200 false claims in 2008. This had increased to 427 by September 2013.

“In those 427 organisations, you will find large household names in Europe, with hundreds of millions of customers,” Connolly said.

He added that some of the companies place unauthorised Safe Harbour seals and logos on their website without ever having signed up to the framework in the first place.

The unauthorised visual symbols often have the word ‘EU’ or the European flag on the seal.

“These are simply very low quality and false representations of the actual membership of the Safe Harbour,” Connolly noted.

Over 10 percent of companies that make a false claim of Safe Harbour membership display the US Department of Commerce Safe Harbour logo on their website.

Privacy advocates have for years asked the Federal Trade Commission (FTC), which enforces Safe Harbour, to address the false claims but with little success.

The FTC has filed six cases of false claims against minor companies and did not sanction any of them.

Around 30 percent of all companies do not provide any information on dispute resolution options, contrary to the Safe Harbour rules. Others who display resolution options point to agencies that charge thousands of dollars to file a complaint.

Over 460 members cite the American Arbitration Association as their dispute resolution provider, which charges the person filing the complaint between $120 and $1,200 per hour with a four-hour minimum charge plus a $950 administration fee.

Meanwhile, Safe Harbour has no provisions to stop NSA-type snooping on EU citizens.

Financial records, data records, travel records, and data and voice carried by US telecommunications providers are excluded from Safe Harbour jurisdiction.

“It would be dangerous to rely on Safe Harbour to manage any aspect of the specific national security issue we face now without first addressing the broader issue of false claims and non-compliance,” Connolly said.

The European Commission, for its part, said it is possible the agreement contains loopholes.

It noted, a few weeks after former NSA agent turned whistleblower Edward Snowden leaked secret documents to the Washington Post and the Guardian, that US data protection standards are lower than in the EU.

“The Safe Harbour agreement may not be so safe after all,” said EU commissioner for Justice Viviane Reding in July.

The commission is set to come out with an assessment report on Safe Harbour before the end of the year.

The FTC, for its part, says the agreement ensures the safe transfer of data of EU citizens.

“We think it is a great way for us to protect European citizens when we are doing a case involving a US company,” FTC commissioner Julie Brill told reporters in Brussels in March.

Modern Privacy: Personal Information Control

Following a few separate discussions this week, it is becoming increasingly apparent that privacy has changed. In light of PRISM, Data Protection, as a privacy industry and as the focal point of privacy law, is falling apart.

Recently I started to engage in privacy and research discussions from the premise that “modern privacy is about personal information control, and less and less about what is now considered data protection.” In fact, a few great discussions at the Horizon Digital Roundtable revealed how difficult a deep dive this is: both how we define privacy and what is construed as access rights to data need a refresh.

Some of the questions that came out of this I put forward here.

The big one I have is: what would the world look like if a person were the data controller for their own personal information, and it was provisioned to companies?

This is a key topic that needs discussion. I have heard it come up in a number of different ways over the last few weeks, and it has in fact been mentioned in the context of VRM for many years now.

In the context of an individual being the data controller for their own personal information: how is personal information control a privacy-by-design approach to information sharing and identity management?

This followed on from that sentiment. For instance:

Does a person in control of sharing personal information with a company make the company more privacy compliant?

How does a company that takes data from a person in control of information sharing show that it is more compliant and trustworthy than a company that stalks people and keeps big, unnecessary profiles?

Can personal information control be a cheaper way for companies to be compliant with new laws and show how good they are to customers?

Could personal information control, in conjunction with services aimed at opening up notices and policies, be the answer for international transfers of personal information? (A new Safe Harbour?)

Rethinking and redefining Access to Data.

The access-and-correction provisions of data protection and privacy laws were written at a time when people didn’t have the means to provide their own identity and data. Today, not only do most people have Facebook accounts, but people are able to keep their own data in their own data stores, always accurate and correct, and provision it on an attribute-by-attribute basis. This would be a very big deal, with impact on privacy and security, as people would be able to provision their own data with a limited set of identity attributes, for a specific purpose (as termed in law), as opposed to the current everything-is-public data model. Of course this information would still require data protection, but the control and management of personal information can be outsourced to the individual. Data minimisation as a common practice would be an attainable reality.

The entrenchment of data protection law and the requirement that people make accounts and share their identity with every service and company is no longer a tenable rationale for personal information gathering in today’s information age.

As a result, I would like to argue that personal information control dramatically changes the compliance landscape. First of all, privacy is discussed in terms of data protection, and an entire industry of privacy professionals is employed to advise on data protection, even as data protection grows increasingly inefficient and broken: many people and employees have access to protected data; data is tapped by different governments with no respect for jurisdiction; people have their information tracked, aggregated and sold. Data protection as the trust model for information control and privacy becomes increasingly less plausible.

As an alternative, companies that took data this way could easily show a higher compliance level than companies that took their own copy of your data. Data minimisation could be augmented by User Managed Access to data. Overall, people would be in much better control of personalisation and of the context in which their personal information is used.

What is more important is that the precepts we hold as constant in privacy and data protection need to be rethought and redefined.

In privacy law there is a common provision about ‘access to data’ (My Data) that a company holds about a data subject. Today this is a person’s profile (or part of a profile), created in order to identify oneself and consent to terms. People have the right of access to data, and in the EU in 2016 there will be a provision for the right to data portability. Beyond correcting and amending data, data portability will formalise personal information control tools so people can provision their own data to companies. Rather than keeping track of many accounts, passwords, terms and privacy policies, we can keep track of our own data, and companies can be provisioned.

No longer is it a question of when people will be in control of provisioning personal data, but of how, and under which terms, personal data will be controlled. Customer Commons is working on this right now, and the issue is now on the radar.

In such a scenario, people are the point of integration. In terms of open data, people can integrate ubiquitous sensor data, or open data from companies, with their own information. At the root of this are the terms, conditions, privacy policies and the administrative processes people need to use to control their own information.

Open Notice is an effort that says closed policies and terms of service are no longer tenable. When people provide consent and personal profiles, they should be able to point to a personal profile which they themselves control. Without this discussion, Facebook is becoming the de facto personal data storage architecture. Before the discussion of a standard set of terms becomes relevant, note that in many jurisdictions the basic rights for personal information control already exist.

So, how do we discuss this variation on the privacy theme: not data protection, but information control? What is the best way to look at the economic performance of policies if we can use them to control personal information? How do we illustrate to the lobbyists and regulators that if privacy policies and terms were opened up, it would be cheap and profitable for companies to be compliant with data protection laws?

Bottom line: since PRISM, the current data protection regime is falling apart faster than ever. Safe Harbour and current laws need to be re-interpreted in terms of the information control possible today, not a legacy of data protection that is no longer relevant.

As a result, the Open Notice effort has spawned a minimal viable consent tag specification as a work item in the ISWG at Kantara. The intention is to provide a common structure for listing policies and relevant information pertaining to a consent, so that the marketplace has a common point from which to start managing policy.

A consent tag will be inherently extensible and is intended to provide companies with a platform to deliver value and personal information control to people.
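To make this concrete, here is a minimal sketch of what such a tag might carry. The spec is still a work item at Kantara, so every field name below is an illustrative assumption rather than the actual specification:

```python
import json

# Hedged sketch: a hypothetical minimal consent tag. Field names here are
# assumptions about what a "common structure for listing policies and
# relevant information pertaining to a consent" might include.
def make_consent_tag(controller, policy_url, purposes, jurisdiction):
    """Build a minimal machine-readable consent tag as a dict."""
    return {
        "version": "0.1",                # hypothetical spec version
        "data_controller": controller,   # who is collecting the data
        "policy_uri": policy_url,        # where the human-readable policy lives
        "purposes": purposes,            # what the data will be used for
        "jurisdiction": jurisdiction,    # which law governs the consent
    }

tag = make_consent_tag(
    controller="Example Shop Ltd",
    policy_url="https://example.com/privacy",
    purposes=["order fulfilment"],
    jurisdiction="EU",
)
print(json.dumps(tag, indent=2))
```

Because the tag is just structured data, extensibility comes for free: a company can add fields for its own value-added services while any consumer of the tag can still find the common core.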


New report on NSA/PRISM and EU DP for European Parliament

Caspar Bowden has obviously worked very hard to provide a thorough report to the European Parliament.

This report follows an excellent talk, captured on video: How to Wiretap the Cloud.

This report  provides an overview of the main legal gaps, loopholes and controversies of these programmes and their differing consequences for the rights of American and EU citizens. The section unravels the legal provisions governing US surveillance programmes and further uncertainties in their application, such as:

- serious limitations to the Fourth Amendment for US citizens;

- specific powers over communications and personal data of “non-US persons”;

- absence of any cognizable privacy rights for “non-US persons” under FISA.

The section also shows that the accelerating and already widespread use of Cloud computing further undermines data protection for EU citizens, and that some of the existing and proposed mechanisms that have been put in place to protect EU citizens’ rights after data export actually function as loopholes.


Rally Against the relaxation of informed consent: Save Freedom

Listening to John Bowman, Head of EU and International Data Protection Policy at the UK Ministry of Justice, speak at an IAPP KnowledgeNet in London was quite revealing.

His update on the revision of the EU Data Protection laws brings to light a lot of drama in the development of the new data protection regulation.

The industry lobby has responded to the Data Protection revision with reports that the new rules will cost industry billions. Clearly, it’s the battle for control of Data Subjects’ information that will cost billions, not the revisions. The economic analysis from industry comes from the perspective that companies, not the Data Subject, are the controllers of personal information. Long-developing initiatives like those found in the Vendor Relationship Management (VRM) community would not only reduce the costs of data control, but open the market for personalisation and contextual control of information, creating immense economic benefits. But of course this would challenge the dominance of industries that profit from the absence of such controls.

Again, the question arises, who is actually representing the Data Subject in the modernisation of privacy controls?

A recent article by Simon Davies of Privacy International blasts the farce these lobbyists have turned the DP revision into, pointing at political tactics being used to dramatically undermine the process of this directive.

The industry is a closed-minded and self-protecting group. Needless to say, the last re-write of the EU DP revisions has been under the purview of the Irish Presidency, with the overt intent of finding a balance of compromises. Still, with a bit of industry knowledge, the clear footprint of large internet companies trying to loosen the requirements for people’s consent shows through:

- they have tried to balance the needs of data controllers and small business with data protection.

- (put in for review) a reduced imposition on data controllers;

- The criterion for valid consent would shift from ‘explicit’, under the Commission’s proposal, to ‘unambiguous’ in the Irish Presidency’s text.

- The need for ‘informed’ consent is also relaxed from the requirement to provide the information requirements set out in Article 14 to the minimal requirement that the data subject is ‘at least’ made aware of: (i) the identity of the data controller and (ii) the purpose(s) of the processing of their personal data (Recitals 33 and 48).

- Data controllers could obtain a single written consent to multiple processing activities, provided clear and distinguishable notice of each different processing activity was given.

A provision that very much bears the mark of a Google lobbyist; Google, without explicit consent, has illegally aggregated its users’ data from multiple sources and reused it for advertising profits.

- data controllers are not required to provide fair processing notices where the data are collected from publicly available sources (Article 14a(4)(c)).

- One of the interesting bits in the re-balance is the co-regulatory regime:

“The proposed EU Data Protection Regulation also foresees drafting of codes of conduct covering various sectors, and allows them to be submitted to Authorities, which may give an opinion as to whether they are “in compliance with the Regulation” (Article 38(2)).”

Article 29, establishing “data protection certification mechanisms and of data protection seals and marks”, is encouraged, though the legal effect of such recognition needs to be clarified.

With such provisions, a standard for consent can be established and then used in just such a co-regulatory scheme. A new era of personal data control by the individual promises to greatly reduce the costs of data protection compliance by enabling Data Subjects to be themselves agents of compliance.
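As a hedged illustration of how a machine-readable consent standard could make the relaxed “at least” notice requirement mechanically checkable, here is a sketch in which the field names are my own assumptions, not any official schema:

```python
# Hedged sketch: under the relaxed Irish Presidency text described above,
# the data subject must "at least" be made aware of (i) the identity of
# the data controller and (ii) the purpose(s) of processing. A
# machine-readable consent record makes that minimum mechanically
# checkable. Field names are assumptions for illustration only.
REQUIRED_FIELDS = ("controller_identity", "processing_purposes")

def meets_minimum_notice(consent_record: dict) -> bool:
    """Return True if the record names a controller and at least one purpose."""
    if not all(consent_record.get(f) for f in REQUIRED_FIELDS):
        return False
    return len(consent_record["processing_purposes"]) > 0

ok = meets_minimum_notice({
    "controller_identity": "Example Ltd",
    "processing_purposes": ["advertising"],
})
print(ok)  # → True
```

A check like this is exactly the kind of thing a Data Subject acting as an agent of compliance, or a certification seal scheme under the co-regulatory regime, could run automatically.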

The costs of compliance can be further reduced if the data subject is the data controller. In fact, many communities and initiatives around personal data control can attest not only to the economic performance of the proposed data controller regulation, but to how it will in fact open a market for innovation and economic growth through personalisation and self-advertising.

For cost effectiveness, innovation, and compliance, the data subject should be the data controller wherever and whenever possible. Information about the use of their digital profiles should be fed to the data subject, rather than controlled by intermediaries (e.g. Facebook) who sell out the data subject, keeping the data on who is interested, and why, behind the antiquated regulation that is today’s data protection regime.

The bottom line: the data subject as a data controller is a topic that needs a much more prominent position in the regulators’ revision of EU privacy law.

The Next Privacy Battle: Cameras That Judge Your Every Move

Tarun Wadhwa, Contributor

In Tampa, Florida, just outside of the building where the Republican National Convention is taking place, vigilant observers are perched high above, working day and night to spot suspicious activity. They are not police officers—they are surveillance cameras equipped with “behavior recognition” technology that constantly studies each person to determine whether he or she is the next security threat. By “learning” patterns of behavior, these devices can monitor large crowds to alert authorities, within seconds, when something out of the ordinary occurs.

High-tech security measures might be expected at large politically charged gatherings. But cameras capable of real-time, sophisticated data mining are starting to appear everywhere.



It may soon be the case that it is no longer necessary to have a human being actively monitoring the screens. Computers will be able to do a better job and for a fraction of the cost. Legal protections from surveillance cameras currently focus on where a camera can be placed.  This will shift to what types of analysis the camera is capable of performing, and for what purpose.

The reason for the quick adoption of these cameras is simple: human beings are not good at attentively watching large amounts of video for very long.  In the United States, it is estimated that there are 30 million surveillance cameras, which create more than 4 billion hours of footage every week.  At best only a small portion of this footage will ever be reviewed. London, for example, has close to 500,000 surveillance cameras. But this has only helped police in solving three percent of all street robberies.

Instead of trying to solve crimes after they have happened, advances in camera technology can spot problems as they are occurring. On Liberty Island, home to one of the nation’s most famous landmarks, surveillance camera data are brought together and analyzed in order to spot when somebody abandons a bag or tries to stay on the island after hours. This technology can alert police to the appearance of an imminent fight.  Across the Bay, in Manhattan, surveillance cameras can track a person’s general description. If there is a report about a suspicious person wearing a red shirt, for example, every person wearing a red shirt in sight of any of their thousands of cameras can be displayed together—in an instant.

It’s not just law enforcement that has taken note of this. Retail outlets such as Macy’s, Babies ‘R’ Us, and CVS have installed systems in some of their stores that can spot shoppers who do unusual things—such as remove many items from a shelf at once, open a case that is normally locked, or walk suspiciously through the aisles. Pathmark grocery stores have implemented similar technology that will quickly alert managers of potential shoplifting and employee fraud—as it takes place.

These systems are programmed to assume that everybody is a potential shoplifter, terrorist, or criminal. In addition to issues related to presumption of innocence, this raises many questions about privacy. The idea of a person closely watching our movements is unsettling.  Does it “feel” different if it’s just a computer rather than a human being?

WikiLeaks cables released earlier this month revealed widespread use by local and federal agencies in the U.S. of TrapWire, a technology that aggregates incident reports and camera feeds to try to detect potential terrorist threats. Understandably, there was uproar over the lack of public disclosure. These same features are being used in other parts of the world to combat dissent. In China, security cameras are commonly used to count the number of people in crosswalks. These alert the authorities if a crowd forms at an unusual time—which could be a sign of unsanctioned protest. Around the world, companies like Sony, Kraft, and Adidas are also installing cameras to target ads to consumers based on their physical features.

The last two decades have largely settled the question of where a security camera can be placed. The promise of increased safety has trumped the right to remain anonymous. In the near future, not having behavioral detection systems present will be seen as a danger and liability, especially as the cost of monitoring technology drops and advanced surveillance becomes even more affordable.

So far, there has been little consequence to this because nothing is usually done with the footage.  But that is going to change. There will, undoubtedly, be concerns arising related to how these datasets can be combined with personally identifiable information to track not only our locations and activities, but our feelings and state. You can expect these to be the next privacy battles in the courts. One would expect the Republicans—who often consider themselves to be the defenders of free speech and liberty—to lead the charge against these technologies.

Meanwhile, back at the Convention in Tampa, cameras have been working overtime alongside police officers to make sure that things run smoothly. If the protests turn violent, as they did at the 2008 Convention in St. Paul, the authorities will now know when and where to react.   It will be interesting to see how the need for domestic security will be balanced against individual rights and our need for privacy.

EU regulators side with Microsoft in IE10’s ‘Do Not Track’ controversy

European regulators have urged an Internet standards-setting body to let Microsoft set users’ preferences for the “Do Not Track” privacy feature in the upcoming Internet Explorer 10 (IE10).

Clearly, this is an incredible point of public privacy policy. It seems obvious that people should have control over being identified by default; this would then increase the value of their participation, autonomy, etc.



Facebook Futurism: The Future of Your Face

Minority Report-style advertising: the type of advertising in which a shop senses your face and shows you an advertisement based on facial recognition.

This is the future of a $100 billion Facebook IPO. Now we have a big, powerful company that needs to exponentially please investors and is irrevocably moving down the path of capturing the faces of your children. Yes, those pictures, the ones lovingly displayed by parents on the Facebook platform.

In 2012, the world is hurtling down the path of integrating more and more advanced sensors into all the things we have. Fridges, stoves, pets, stairs: all are becoming an incredible opportunity for sensing more and more data. For Facebook, this eventually means targeted advertising based on facial recognition, on a street near you!

Facebook provides a freely accessible service and sells advertising to support it, meaning that the people, or your face, are the product. Now, with an unprecedented $100 billion IPO, Facebook is developing a free smartphone based on advertising. This will give Facebook the ability to come with you everywhere, auctioning real-time advertising opportunities to its customers, so that the advertising platform can interact with the environment and greet the Facebook user at every turn. Everywhere!!

Where will this lead? It will inextricably bring Facebook into your private home. By 2017, this will lead to custom advertising where even your children’s faces will be ‘sensed’ and their favourite candies will appear, as a child-level advertising platform, in front of their eyes as they enter the corner store.

Nothing will be sacred; all of your space is for sale, and Facebook owns your image. Why else would Facebook pay $1 billion for Instagram, the picture-taking application? Just this week, Facebook bought an Israeli facial recognition company in its fast and furious drive towards turning your face into a name tag, with the inevitable launch of a (most likely free) Facebook smartphone to its almost 1 billion users around the globe.

To what end? Cheap Facebook fridges? The exploitation of your toaster? We shall see.

Welcome to the future of free services. And, thanks for your face!!



Viviane Reding On The New EU Rules

An informative interview with Viviane Reding, briefly covering the contentious issues of the right to be forgotten and data breach notification.

She explains that identity trust is a priority: people are finding out weeks later, by chance, that they have been robbed, and this is unacceptable.

Also, people own their own information: if we want to take it from Facebook and put it with another company, we have the right to do so!

Check out the brief video for more detail!!

Viviane Reding on New EU Rules