
The Value of Data in the Digital Age

To date, data is being valued and priced by everyone EXCEPT the creators of that content: YOU. If we want to change that, many things need to happen, but it begins with taking the time to figure out how a person values their own data. So let's dissect this idea and see if we can shed some light on it.

First, this process is truly unique, because for the first time EVER, every single person on the planet has the potential to sell a product. Second, instead of being a "consuming" culture and propelling the corporate world forward, human beings are the ones in a position to profit. Third, everyone's value judgement on data is unique, personal and unquestionable. Fourth, the opportunities for people to enrich themselves in a world of possible technological unemployment are tremendously important to the welfare of society. Finally, on top of the social ramifications, there are obvious moral ramifications. As highlighted by corporations' misuse of your data, this idea of individual data ownership is morally correct.

Now we are not talking about ALL data. If you use Amazon's services to buy something, all of the information that you create by searching, browsing and buying on the Amazon site also belongs to them. So while I can opine on the "value" of individual data, I am certain that the legal questions around data are just beginning to be sorted out.

So with all of that in mind, let’s examine how each individual person may value the data that they can provide. Noteworthy to this discussion is that every individual has a different value function. Different people will value different things about their data. So it is vital that we appreciate that each person will price things uniquely.

However, the parameters that they weigh can be summarized in a few key variables, which are covered below. So let's create a list and explain each one (a rough valuation sketch follows the list):

  1. Risk of Breach: Each data item, if it falls into the wrong hands, can cause harm. This is the risk of breach. This risk will be perceived differently based upon the data user's reputation for safety, a perceived sense of associated insurance and the context of the data itself. For example, let's consider four tiers of risk of breach. Tier 1 (HARMLESS): the contents of my dishwasher. This data might have value to someone and could not harm me if used nefariously. Tier 2 (LIKELY HARMLESS): the contents of my refrigerator. Still unlikely to be able to hurt me, but since people would know what I consume, they could possibly tamper with it. Tier 3 (HARMFUL, ONLY INCONVENIENT): Examples here might include a financial breach, where the risk is often not only yours because there is a form of insurance (bank insurance or a similar backstop), but it is still dangerous and painful when it occurs. Tier 4 (HARMFUL, PERSONAL SAFETY): Examples here might include your exact location, your health records, access to your cybernetics and/or your genetic code.
  2. Risk of Privacy: How sensitive or personal do we consider the data items to be? On this risk, I believe that pricing is rather binary, or maybe parabolic. Many data items which we can produce do not make us concerned if made public. That is, until a line is crossed, where we consider them private. My pictures, fine. My shared moments, fine. My bedroom behavior, private. So when that line is crossed, the price of the associated data rises substantially. To continue the example, a manufacturer of an everyday item, such as pots and pans, may not have to pay a privacy premium for data associated with our cooking habits. However, a manufacturer of adult toys may have to pay a substantial premium to gain access to the bedroom behavior of a meaningfully sized sample of data. This is a good time to remember that these pricing mechanisms are personal, true microeconomics, and everyone will value the risk of privacy differently, even to the point where the example I just gave may be completely reversed. Bedroom behavior, no problem… but keep your eyes off of my cooking.
  3. Time: How easy is it to generate the data? Can I generate the data simply by existing? That data will be cheaper. Do I have to spend my time to create the data? That data will be more expensive. Would you like me to watch your commercials? More expensive. Would you like me to fill out your survey? Two questions is cheaper than twenty questions. Time is also a function of the entire mechanism for creating and monitoring the data.
  4. Applicability: Is the data being asked of me relevant to me? This is a question of "known" versus "unknown". If I regularly drink a certain type of coffee, I am more likely to accept coupons, ads, sales and promotions from my coffee shop than I am from the tea emporium around the corner. The function here is inverted: as the applicability decreases, the value of access to "me" increases from my perspective. That is not to say that it also increases for the data consumer, so with respect to applicability we typically have juxtaposed supply and demand curves. Also, if you only value data based on the supply side (what I am willing to give), then you miss out on the revenue opportunities that come from allowing people access to your attention to "broaden your exposure".
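To make the interplay of these four variables concrete, here is a minimal sketch, in Python, of how one person's valuation function might combine them. Everything in it is an illustrative assumption on my part (the tier multiplier, the flat privacy premium, the hourly rate, the names themselves); it is one possible encoding of the ideas above, not a proposed standard, and every individual would set their own parameters.

```python
from dataclasses import dataclass

@dataclass
class DataOffer:
    """A request from a data consumer for one item of personal data (illustrative)."""
    breach_tier: int          # 1 = harmless ... 4 = personal safety
    is_private: bool          # does this item fall on my side of the privacy line?
    minutes_of_effort: float  # time I must actively spend to produce it
    applicability: float      # 0.0 = irrelevant to me ... 1.0 = highly relevant

def personal_price(offer: DataOffer,
                   base_price: float = 0.10,       # assumed floor price per item
                   tier_multiplier: float = 5.0,   # each breach tier multiplies the price
                   privacy_premium: float = 20.0,  # binary jump once the privacy line is crossed
                   hourly_rate: float = 30.0) -> float:
    """One person's hypothetical asking price for a single data item."""
    price = base_price * (tier_multiplier ** (offer.breach_tier - 1))  # 1. risk of breach
    if offer.is_private:
        price += privacy_premium                                       # 2. risk of privacy
    price += hourly_rate * (offer.minutes_of_effort / 60.0)            # 3. time
    # 4. applicability, inverted: the less relevant the request,
    # the more access to my attention should cost.
    price *= 1.0 + (1.0 - offer.applicability)
    return round(price, 2)

# Example: a ten-minute survey about my cooking habits (tier 2, not private, fairly relevant)
print(personal_price(DataOffer(breach_tier=2, is_private=False,
                               minutes_of_effort=10, applicability=0.8)))  # 6.6
```

The exact functional forms here (multiplicative tiers, a flat privacy premium, a linear time cost) are just one way to write it down; the point is that each person plugs in their own numbers.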

If the world changes to a personal-data-driven model, then the corporate world and the artificial intelligence world will have to learn how to examine these key variables. The marketplace where these transactions occur MUST be a robust mechanism for price discovery, whereby many different bids and offers are being considered on a real-time basis to determine the "price/value" of data. This is why I have proposed the Personal Data Exchange as a mechanism for identifying this value proposition. Exchanges are in the business of price discovery on behalf of their listed entities, in this case, "you".
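As a rough illustration of what that price discovery might look like, here is a hypothetical sketch of a single matching step on such an exchange: data consumers post bids, individuals post offers (their personal prices from a function like the one sketched above), and the exchange pairs them. The matching rule and the numbers are assumptions for illustration, not a description of any existing system.

```python
def match_orders(bids, offers):
    """Pair the highest bids with the lowest offers; a trade occurs when bid >= offer.
    Returns a list of (bid, offer, clearing_price) tuples. Illustrative only."""
    trades = []
    for bid, offer in zip(sorted(bids, reverse=True), sorted(offers)):
        if bid >= offer:
            trades.append((bid, offer, round((bid + offer) / 2, 2)))  # midpoint clearing price
        else:
            break  # remaining bids are below remaining offers; no more trades
    return trades

# Hypothetical example: three advertisers bidding for one category of personal data,
# four individuals offering it at their own personal prices.
bids = [4.00, 2.50, 1.75]
offers = [1.50, 2.00, 3.00, 6.60]
print(match_orders(bids, offers))  # [(4.0, 1.5, 2.75), (2.5, 2.0, 2.25)]
```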

Related reading: Personal Data — How to Control Info and Get Paid for Being You (medium.com), a quick and easy way to break some of the current monopolies that exist in the personal data market.

In the end, this is the morally correct position. For a variety of reasons, it is a justifiable and necessary change to a marketplace that was created largely without your consent. Recent changes to the law, such as the GDPR in Europe, have begun to claw back the rights of the individual. But if we can get this done, it becomes a complete game changer. Please… get on board. Your thoughts and critiques are welcome and encouraged, ForHumanity.

Facebook and Cambridge Analytica - "Know Your Customer", A Higher Standard

Know Your Customer (KYC) is a required practice in finance, defined as the process of a business identifying and verifying the identity of its clients. The term is also used to refer to the banking and anti-money laundering regulations which govern these activities. Many of you will not be familiar with this rule of law. It exists primarily in the financial industry and is a cousin to laws such as Anti-Money Laundering (AML) rules, the Patriot Act of 2001 and the USA Freedom Act of 2015. These laws were designed to require companies to examine who their clients were. Are they involved in illegal activities? Do they finance terrorism? What is the source of these monies? The idea was to prevent our financial industry from supporting or furthering the ability of wrongdoers to cause harm. So how does this apply to Facebook and the Cambridge Analytica issues?

I am suggesting that the Data Industry, which includes any company that sells or provides access to individual information, should be held to this same standard. Facebook should have to Know Your Customer. Google should have to Know Your Customer. Doesn't this seem reasonable? The nice part about this proposal is that it isn't new. We don't have to draft brand-new laws to cover it, rather just modify some existing language. KYC exists for banks; now let's expand it to social media, search engines and the sellers of big data.
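To make the proposal concrete, here is a minimal sketch of the kind of check a data seller might be required to run before granting a buyer access. The record fields and the pass/fail rules are my assumptions, meant to mirror financial-industry KYC; they are not an existing API or a legal standard.

```python
from dataclasses import dataclass

@dataclass
class DataBuyer:
    """Minimal KYC-style record a data seller might keep on each customer (illustrative)."""
    legal_name: str
    verified_identity: bool        # identity documents checked and confirmed
    beneficial_owners_known: bool  # do we know who ultimately controls the buyer?
    stated_use_of_data: str        # what the buyer says the data is for
    on_sanctions_or_watch_list: bool = False
    prior_violations: int = 0

def kyc_approved(buyer: DataBuyer) -> bool:
    """Return True only if the buyer passes every basic KYC-style check."""
    if not (buyer.verified_identity and buyer.beneficial_owners_known):
        return False
    if buyer.on_sanctions_or_watch_list or buyer.prior_violations > 0:
        return False
    # A declared use is required so that later misuse can be audited against it.
    return bool(buyer.stated_use_of_data.strip())

buyer = DataBuyer("Example Analytics Ltd.", verified_identity=True,
                  beneficial_owners_known=False,
                  stated_use_of_data="ad targeting research")
print(kyc_approved(buyer))  # False: ultimate ownership has not been established
```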

Everywhere in the news today, we have questions about "who is buying ads on social media?" Was it Russians trying to influence an election? Was it neo-Nazis, ANTIFA or other radical ideologues? Is it a purveyor of "fake news"? If social media outlets were required to KYC their potential clients, then they would be able to weed out many of these organizations well before their propaganda reaches the eyes of their subscribers. Facebook has already stated that it wants to avoid allowing groups such as these to influence its users via its platform. So it is highly reasonable to ask them to do it, or be penalized for failure to do so. Accountability is a powerful thing. Accountability means that it actually gets done.

Speaking of "getting it done", some of you may have seen Facebook's complete about-face on its compliance with GDPR, moving 1.5 billion users out of Irish jurisdiction and to California, where there are very limited legal restrictions. https://arstechnica.com/tech-policy/2018/04/facebook-removes-1-5-billion-users-from-protection-of-eu-privacy-law/

If you aren't familiar with GDPR, it is Europe's powerful new privacy law. For months, Facebook has publicly stated how it intends to comply with the law. But when push came to shove, its most recent move was to avoid the law and avoid compliance as much as possible. Flowery language is something we often hear from corporate executives on these matters, but in the end, they will still serve shareholders and profits first and foremost. So unless these companies are forced to comply, don't expect them to do it out of moral conviction; that's rarely how companies operate.

Returning to the practical application of KYC: for a financial firm, this means that a salesperson has to have a reasonable relationship with their client in order to assure that they are compliant with KYC. They need to know the client personally and be familiar with the source and usage of funds. If a financial firm fails to execute KYC and it turns out that the organization they are doing business with is charged with a crime, then the financial firm and the individuals involved would face swift ramifications, including substantial fines and potential jail time. This should apply to social media and the data industry.

Let me give you a nasty example. Have you looked at the amazing detail Facebook or Google have compiled about you? It is fairly terrifying and there are some out there (Gartner, for example) who have even predicted that your devices will know you better than your family knows you by 2022.

https://www.gartner.com/smarterwithgartner/emotion-ai-will-personalize-interactions/

Now, assuming this is even close to true for many of us, imagine a case where that information is sold to a PhD candidate at MIT, or another reputable AI program, except that the PhD student, beyond doing their AI research, is funnelling that data on to hackers on the dark web, or worse, to a nation-state engaged in cyberwarfare. How easy would it be for that group to cripple a large portion of the country? Or maybe it has already happened, with examples like Equifax and its 143-million-client breach. Can you be sure that the world's largest hacks aren't getting their start by accessing your data from a data reseller?

To be fair, in finance you are often taking in the funds and controlling the activities after the fact; you know what is going on. With data, it might seem, you are often selling access or the actual data to the customer and no longer have control over their activities. But this argument simply enhances my interest in Know Your Customer, because these firms may have little idea how this data is being used or misused. Better to slow down the gravy train than to ride it into oblivion.

Obviously the details would need to be drafted and hammered out by Congress, but I am seeking support for the broader concept and encouraging supporters to add it to the legislative agenda. ForHumanity has a fairly comprehensive set of legislative proposals at this point, which we hope will be considered in the broader context of AI policy. Questions, thoughts and comments are always welcome. This field of AI safety remains so new that we really should have a crowd-sourced approach to identifying best practices. We welcome you to join us in the process.