August 26, 2016


Share your thoughts with us

  • Lisa Neitzel

    Yes, I think it’s a civic responsibility. Miguel Luengo-Oroz at UN Global Pulse said, “At the end of the day, citizens are creating the data, therefore it should be used to protect human rights in all its forms. The data revolution has been underway in the private sector for a long time. Most of the data that is out there is held, used, and monetized by private companies. Forward thinking private sector companies that have this valuable data, can share it in ways that fully respect the privacy of their customers, doesn’t compromise their own competitiveness in the market, and yet could be used to make public policies and understand what’s happening while it’s still happening.”

  • Kathryn Watson

    Data is in large part free already. What is not free or available is the aggregation and interpretation. The insight is not free. People are still our biggest impediment in understanding the data, so it’s important to get this data in the hands of the right researchers and data scientists who can understand and interpret for social good.

  • Alphie Ali

    Absolutely. To help the masses, one must first understand the masses – and nothing helps us do that like data does.

  • Shelley Noel

    As long as business value or trade secrets are not compromised, why not? Of course that would mean that we’d need organizations to shepherd this kind of transaction – ensure the researchers aren’t disclosing the data to other parties or revealing sensitive proprietary information, etc. A collective where companies can sign up to be part of a data sharing program (a little like the new national health exchange) could ensure the proper processes are being followed. This is a great idea. So much data, and we’re barely using it! What better way to use it than to advance social welfare and environmental efficiencies.

  • delicatetoe

    The question is misleading. The idea of so-called “anonymous” data is a myth. The question presupposes that it is possible to divorce PII from aggregate data. This is simply not the case. Links below for reference.

    Yes, we want to use data for social good, but making data more readily available also carries an inherent risk of abuse and misuse by criminals and by oppressive governments that punish political dissidents, as in China, the Middle East, and Russia. There are even examples of the US government doing the same (link below). Even using Tor has proven useless in hiding consumer traffic. Device fingerprinting is so accurate that virtually no amount of VPN or proxy routing can defeat it. So, before we can have a discussion about what to do with this data, we have to figure out how to meaningfully and thoroughly anonymize users from the data they create. This burden must be shouldered by the companies who collect the data.

    It’s no mystery that consumers have all but given up the notion of privacy. News headlines talk about the so-called “post-privacy world” we have created. Everything we do, say, and think is monitored, analyzed, dissected, and monetized. Every privacy policy I have ever read (and yes, I’ve read every single one I’ve agreed to) almost always starts with the phrase “we take your privacy seriously.” No they don’t. There is no evidence to support such a statement, so why do companies always open with that?

    Too many companies today treat the data generated by their customers with reckless abandon. They share it with anyone willing to pay for it. The advertising and social media industries are particularly notorious for this (link below). I think that the general public feels apathetic and powerless about how their data is used, shared, and analyzed. Data companies are well aware of this apathy, and continue to scrape away at any notion of privacy. The government supports privacy erosion by making the argument that “if you have nothing to hide, why do you need privacy?” I’m going to avoid going on a tangent about the counterargument to that, but just know that there are a multitude of reasons that any law-abiding citizen should value privacy, including the fact that insurance companies will analyze your social media behavior, contacts, and posts to assess your rates (link below).

    On a personal note, I would like to see companies that collect, generate, and store data do more to empower consumers by giving them a choice about what data is collected and with whom it is shared. The current scheme in the digital market is that one must comply with a lengthy privacy policy full of incomprehensible legal jargon that most consumers can’t decipher, so they don’t even bother to click through and read the policy they’re agreeing to. How many of us just click “agree” when confronted with a terms and conditions agreement? There is even some question as to whether this constitutes informed consent, which could pose a degree of legal liability in the event that the data is misused. Almost every company now expects consumers to agree to share their data or be barred from using the service or purchasing the product. For a privacy advocate like me, that’s a non-starter. Fortunately, I’ve been blessed with sufficient cognition to understand the consequences of being careless with my data. Most consumers may not be aware of these potential risks, or, more likely, they feel powerless to choose.

    Ethically, the correct answer is to obtain informed consent by making explicit opt-in the default for all data collection practices. I understand this may affect the bottom line of some companies, but the consequence of not moving to this scheme could be that whistleblowers, journalists, and civil rights activists are jailed or worse in foreign countries when their “anonymous” user data is analyzed by the government oppressing them, and used to track, locate, and prosecute them.

    Much like the US government’s argument about cryptography being used to hide criminal activity, anonymizing data and giving consumers meaningful choice has positive and negative consequences. True online anonymity introduces the risk of things like online bullying, but it also allows political dissidents and whistleblowers the freedom to share their ideas with the world without fear of retribution. I would argue that using aggregate data for “social good” also introduces risk to those trying to do “social good” in parts of the world where they may not necessarily enjoy the same freedoms we do.

    Bottom line: I’d like to see more done to protect user data and to give consumers meaningful choices about how their data is collected, stored, and used. I believe the benefits of giving consumers more choice and protection outweigh the potential benefits of making this data available to more sources. It would also boost consumer confidence if they’re given a choice.

  • roddybgoode

    Yes. Hidden in the ore of big data could be little gems. Apps that suggest movies, clothing, food, destinations, etc. could anonymously guide people towards their spheres of interest. Economic growth. General welfare. Just a little pattern, if found and harnessed among what otherwise might look like white noise, could help all of us. We can imagine how a crystal lattice might begin to form in the annealing process. Yes, share the data and make it accessible.