Connected Car Principles Part of Trend Toward Corporate Privacy Commitments

SIIA applauds the Alliance of Automobile Manufacturers and the Association of Global Automakers for adopting a set of privacy principles for vehicle technologies and services. These principles, released on Thursday, November 13, are a responsible step toward protecting the information collected by connected cars, and they reflect an important movement toward strong corporate commitments to data privacy.

Data has become essential for improving auto safety, reducing traffic congestion, increasing vehicle efficiency and powering other advanced services for 21st-century drivers. By bolstering consumer confidence, these principles will help ensure these advances can continue. They cover transparency, choice, respect for context, data minimization, de-identification, retention, data security, integrity, access and accountability. In particular, they put limits on the use of geolocation information for marketing purposes and give consumers access to the information collected about them.

These principles reflect widely accepted privacy standards advocated by the Federal Trade Commission and the Administration, and are also reflected in U.S. sectoral privacy laws.

Led by SIIA, the education technology industry has followed a similar course. In October, in conjunction with the Future of Privacy Forum, we announced a set of 12 privacy principles that commit educational service providers to protecting the privacy of student information. This effort is vital for our country: new information technologies are enabling personalized education, improving educational outcomes, allowing for more efficient school operations, supporting the development of better instructional materials, enabling faster identification of students at risk of failing, and more.

In particular, the principles bolster the deeply entrenched social norm and legal requirement that student information should be used solely for educational purposes. The student privacy pledge also commits service providers to refrain from using student information for targeted advertising, to limit student profiles to educational uses and to protect information from unauthorized access.

Data collection and analysis are essential for any industry that intends to grow and flourish in the 21st century. But to succeed, companies must have the trust and confidence of data subjects, customers, policymakers and the public. Voluntary steps such as those taken by the automotive and education technology industries are vital if we are to hold onto and grow this trust.


Mark MacCarthy, Vice President, Public Policy at SIIA, directs SIIA’s public policy initiatives in the areas of intellectual property enforcement, information privacy, cybersecurity, cloud computing and the promotion of educational technology.

Should the Right to be Forgotten be Secret and Global?

Implementing the right to be forgotten was never going to be easy, as earlier blogs in this series have pointed out. But recent press reports show just how tricky this implementation is going to be, revealing suggestions that search engines should take down links globally and keep their actions secret. Both of these ideas would be missteps.

The secrecy suggestion seems backed by common-sense logic: it is self-defeating for search engines to announce to the world that they have taken down links to stories that should be forgotten. But that is not the concern, since search engines aren’t making such public announcements. Rather, they are informing the third-party publishers that a link to their content has been deleted from search results. So the problem seems to be that if affected parties know that a link has been deleted, they might object, and this objection would direct attention to the very topic that was to have been forgotten.

There is clearly room for debate on what the right policy is here. Any added discussion of the takedowns creates an added risk of exactly the kind of exposure the right to be forgotten is intended to avoid. But secrecy seems to be the wrong answer. In fact, if search engines kept their deletions secret, they would face accusations of a lack of transparency. Publishers clearly have an interest in knowing that links to their content will no longer appear in certain search results. For one thing, it provides a check on the search engines getting it wrong, as apparently they did in the early days of implementing the takedown program. And as long as the rest of the world isn’t simultaneously informed of the takedowns, this seems a balanced approach.

The other concern seems to be that the new right to be forgotten will not be effective if the takedowns are purely local.  Why should people outside the EU be allowed to get search results that people inside the EU cannot get? So, the argument goes, search engines should delete links globally when they decide that they should be deleted under EU privacy law.

This is the wrong direction.  It improperly extends EU privacy law to the world. The impulse to limit information globally is understandable, but unworkable. We know this from other examples. For instance, it is easy to understand why Turkey objects to videos that denigrate the Turkish nation and would like to make sure that they are not shown anywhere in the world. But it goes too far to extend Turkish rules on hate speech to the entire world.  A reasonable compromise is to comply with Turkish law with respect to videos shown in Turkey.

This is the balance struck in many other areas of cross-border electronic commerce. Internet gambling rules are locally, not globally, enforced. British law permits and regulates Internet gambling, while US law prohibits it. It would be an easy matter to structure US law so that global payment systems blocked all Internet gambling transactions. But that is not what US law does. It provides for local enforcement. People in Britain can go on the Internet to gamble, while people in the US face restrictions, including restrictions on using payment cards at Internet gambling sites. Examples are not hard to multiply: alcohol ads, for example, are not allowed in Saudi Arabia, but are permitted on websites available in other countries.

There is certainly nothing in the right to be forgotten decision that compels search engines to delete search results globally. Moreover, earlier cases under EU law show a conscious desire to avoid the extraterritorial application of European privacy law. In the 2003 Bodil Lindqvist case, for instance, the European Court of Justice rejected the idea that posting material on an EU website amounted to a transfer of data to other countries. It made this judgment precisely to avoid the implication that the entire Internet would be subject to EU jurisdiction.

Each country is entitled to its own privacy laws, Europe no less than the United States.  We should seek to make them sufficiently compatible at the edges so as to allow data transfers.  But simply extending European jurisdiction to the globe is the wrong way to go.


Mark MacCarthy, Vice President, Public Policy at SIIA, directs SIIA’s public policy initiatives in the areas of intellectual property enforcement, information privacy, cybersecurity, cloud computing and the promotion of educational technology.

Implementing the Right to Be Forgotten

It isn’t easy to implement the European Court of Justice’s right to be forgotten decision. According to one press report, Google has received 70,000 requests for link removal through its online form since the program went into effect last month. Another report says requests to remove links are arriving at an estimated rate of one every seven seconds. As predicted here after the court’s decision in May, the results are not pretty. But the fault lies not in the implementation but in the flawed underlying decision, which restricts free expression and puts substantial legal discretion in the hands of search engines.

Let’s recall how extreme the decision was.  It said that under European law privacy trumps free expression in the context of Internet search.  The right to respect for private life and the right to the protection of personal data “override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information upon a search relating to the data subject’s name.”  There can be an exception to this general rule: “…for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of inclusion in the list of results, access to the information in question.”

The court’s standard for determining whether privacy interests are implicated was whether the information was “inadequate, irrelevant or no longer relevant, or excessive…”  The court added that triggering privacy interests did not require a finding that “the inclusion of the information in question in that list causes prejudice to the data subject.” So what would trigger privacy interests? To say the court’s guidance is extraordinarily vague is an understatement.

Given the court’s references to search engines making access to information “appreciably easier” and playing “a decisive role” in the dissemination of information on the Internet, it is hard to avoid the conclusion that the court intended to limit the effective dissemination of information on the Internet. But it did so by granting search engines the discretion to make some delicate value judgments, without specific guidance on how to make them.

So how’s the implementation going? Certainly Google hasn’t done everything right. Taking down some links and then apparently restoring them certainly seems to be a misstep. But on the whole they’ve done a pretty balanced job. They are requiring the filing of a request, including a statement on why release of the information would not be in the public interest. There is no indication that they are granting all requests or turning all requests down. They notify the publisher of the links removed from search results, but they do not reveal the identity of the person requesting the takedown, since this would reveal the information that the data subject was trying to conceal. And they are following the law by limiting takedowns to EU citizens and to EU search results rather than extending the EU regime to the world.

Some commenters suggest that search engines are granting too many deletion requests and should instead routinely decline them all, which would force data subjects to go to data protection authorities or the courts to get links removed.

SIIA Releases Student Privacy Policy Guidelines & Recommendations During Testimony before the CA State Assembly

The safeguarding of student privacy and data security remains on the agenda for many state (and federal) policymakers. SIIA took the opportunity of its invited testimony before the California state legislature to release its new “Policy Guidelines for Building a Student Privacy Trust Framework.”

The SIIA guidelines outline principles and considerations to ensure policies are appropriately targeted to enhance student confidentiality while limiting unintended or unnecessary barriers to school operations or digital learning opportunities. SIIA shared many of these before the California State Assembly hearing  (see video starting at 33 minutes) on “Ensuring Student Privacy in the Digital Age,” hosted jointly by the Education and Select Privacy Committees.

Today, new technologies like cloud computing are enhancing school capacity, providing adaptive and personalized learning; anytime, anywhere data access; enhanced data management functionality; powerful data analytics; and improved security. These tools and techniques allow educators to manage more data in more cost-effective and sophisticated ways to inform instruction and enhance school productivity.

While a framework of laws and practices has been highly effective in safeguarding student confidentiality, we recognize the need to continually review policies and improve practices to enhance the trust framework between parents, schools and service providers.

We are pleased that stakeholders are doing just that in response to recent questions and concerns.

SIIA is working to inform legislators across the country as they develop and debate new regulation, but we are concerned that some of the proposed policy solutions may get ahead of the actual problems and over-correct for them. It is important that new legislative requirements provide sufficient local flexibility, are not so restrictive or impractical that they discourage and stifle innovation, and are consistent with existing federal protections so as to avoid regulatory conflicts and stakeholder confusion.

We touched on several of our newly released policy guidelines at the California hearing:

First, new policies should limit the scope to student personally identifiable information as defined under federal law.

Second, new policies should focus on the need to educate, equip, and empower schools and educators to make informed decisions that safeguard student data and serve student learning. This can be accomplished through transparency by schools and service providers, by instituting local and state governance around data use policies, and by building capacity through investment in professional development, data security technology tools, and student digital literacy. These are important alternatives, or at least complements, to policy prohibitions that may not account for unique local and evolving circumstances.

Third, new policies should provide schools and agencies with flexibility around the use of student information to meet their locally determined goals within the existing framework of federal protections. SIIA agrees that student personal information should not be used for non-educational purposes, such as selling data to insurance companies or targeting insurance advertising, and that it should be used only for the educational purposes for which it was entrusted. The challenge is translating these principles into statute in a manner future-proofed for the wave of digital learning transformation at home and at school. Use policies should distinguish between inappropriate commercial use of personal data for non-educational purposes and the appropriate actions of a for-profit (or non-profit) school service provider that uses that information for educational purposes authorized by its customers and federal law: for educational product evaluation, improvement and development, and to drive adaptive and customized learning at school and home.

Fourth, while SIIA agrees that deleting data once it is no longer needed for the purpose for which it was collected is the appropriate general practice, policies must differentiate by data type, use and control. For example, deletion decisions are most often under the direct control of the school (not the service provider), while newer models provide for parent-consented and parent-owned personal student accounts (and their data, apps and student-created resources). Further, absolute destruction is not appropriate where aggregated, de-identified and otherwise anonymous data is needed for ongoing educational purposes, such as powering software algorithms, or where personal information is needed for accountability systems or future transcript services (see the illustrative sketch after this list).

Fifth, new policies governing local contract requirements must allow for flexibility between local schools and their service providers. Any state requirements should provide a template identifying which issues should be addressed rather than prescribing the specific terms of how to address them.
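To make the fourth point concrete, the following is a minimal sketch of what retaining aggregate, de-identified data while destroying identifiable records might look like in practice. It is purely illustrative: the field names, the grouping key and the suppression threshold are hypothetical assumptions, not drawn from any particular provider’s practice or from statute.

```python
# Illustrative sketch only (hypothetical fields and threshold): a provider
# keeps aggregate, de-identified metrics and destroys identifiable records.
from collections import defaultdict

def aggregate_and_deidentify(records, min_group_size=10):
    """Collapse per-student records into per-grade average scores.

    No names or IDs survive aggregation, and groups smaller than
    min_group_size are suppressed to reduce re-identification risk.
    """
    groups = defaultdict(list)
    for rec in records:
        # Keep only the non-identifying fields needed for analytics.
        groups[rec["grade_level"]].append(rec["assessment_score"])

    aggregates = {}
    for grade, scores in groups.items():
        if len(scores) >= min_group_size:  # small-cell suppression
            aggregates[grade] = sum(scores) / len(scores)
    return aggregates

# Hypothetical records; after aggregation, the identifiable originals
# can be deleted while the averages remain to improve the product.
students = [{"name": "A. Student", "student_id": i,
             "grade_level": 7, "assessment_score": 70 + (i % 25)}
            for i in range(30)]
print(aggregate_and_deidentify(students))  # e.g. {7: 80.33...}
del students  # the personally identifiable records are destroyed
```

The suppression threshold matters: even “anonymous” averages over a handful of students can be re-identifying, which is one reason deletion policies need to differentiate by data type and use rather than mandate absolute destruction.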

SIIA agrees with the need to safeguard student data privacy and security. But further policy protections must be carefully crafted so that privacy-protection floors do not inadvertently and unnecessarily become educational ceilings. SIIA encourages new policies focused on transparency, governance and capacity, empowering parents and school officials to make sound and safe use of student information in ways that advance student learning.


Mark Schneiderman is Senior Director of Education Policy at SIIA.

A Dark Day for Free Expression on the Internet

The European Court of Justice’s recent decision granting EU citizens a right to be forgotten by search engines is a major blow to free expression on the Internet. Reaction from media outlets like the New York Times and the Financial Times has been harshly critical, and rightly so. The key thing for Internet users and for public policymakers in Europe is to understand how this ruling might reduce the amount of accurate information available on the Internet.

The decision did not spring from any impulse to censorship, but from an honest attempt to vindicate the fundamental right to privacy in a digital age.  That’s why any comparison to authoritarian government censorship of the Internet is just overblown rhetoric. But unless it is modified or re-interpreted through further jurisprudence or legislation, this decision might well be the turning point where free expression on the Internet begins to recede from its current high water mark.

What’s the threat to free expression?  The court attempts to balance the interest of search engine users in access to information and the privacy interests of individuals who are the subject of lawfully published material available on the Internet.  In making that balance, however, the court says:  “the data subject’s rights… override, as a general rule, that interest of internet users…”  That’s the problem in a nutshell: under the decision privacy trumps free expression on the Internet.

The court envisages a process in which a person who thinks that search results are an intrusion into his private life presents a complaint to the search engine stating that one or more links in the search results refer to data that appear to be “inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed and in the light of the time that has elapsed.” The search engine then must “duly examine” this complaint, and if it finds that the links meet this standard of “inadequacy, irrelevance or excessiveness” it must delete them from its search results. This must be done even if the data are accurate and their initial publication lawful. An exception to this general requirement allows the search engine to retain links in search results when “there are particular reasons, such as the role played by the data subject in public life, justifying a preponderant interest of the public in having access to the information when such a search is made.”

This process is heavily weighted in favor of a complainant, and allows free expression to function only as a defense against a finding of a privacy violation.

The particular case before the court illustrates the process. A Spanish man incurred certain debts many years ago, a fact that was accurately reported in a newspaper at the time. He has since cleared up the debts. But a search of his name today returns the original story in a prominent place, thereby recirculating true but outdated information about him. He asked the search engine to remove these links. Under the new regime, the search engine would be required to go through the process above using the new standard, and if it finds that the information is inadequate, irrelevant or excessive, it must consider whether the role played by the man in public life gives the public a preponderant interest in access to the information. If not, it must delete the links.

Search engines must assess what this ruling means for their internal policies and practices and seek to bring them into compliance. The cost and burden to these companies are significant and might make operating an effective search engine in Europe a nearly impossible task.

But the real impact of the ruling is that it is likely to reduce the amount of accurate information available on the Internet.

Even a preliminary review of the decision reveals substantial challenges:

What do the new standards mean? Are there really three different bases for deleting search results – irrelevance, inadequacy or excessiveness? A standard of excessiveness is particularly troubling and could potentially require search engines to assess whether a publisher gathered too much accurate information about a person.

What role do other interested parties have in a complaint?  The original publisher, for instance, might not want links to his stories suppressed in search results.  Other people and organizations are typically mentioned in published stories.  What if they want links to the stories available and think their rights are violated by suppression?  Are the search engines supposed to convene a process to allow all interested parties to present evidence as if they were a court?

How broadly does the ruling apply?  It covers search engines, but many companies are in the business of aggregating lawfully acquired accurate information from a variety of public and private sources and making it available to the public.  The same story that the search engine would have to delete is also available in thousands of commercially available databases throughout the world.  Are those providers of information services subject to deletion demands from EU data subjects even if they are not based in the EU?

These are just preliminary questions that must be clarified going forward.  But a new day of privacy-based deletion requests is dawning.  Unless EU policymakers intervene, the new day is likely to be a dark one for free expression.


Mark MacCarthy, Vice President, Public Policy at SIIA, directs SIIA’s public policy initiatives in the areas of intellectual property enforcement, information privacy, cybersecurity, cloud computing and the promotion of educational technology.

Digital Policy Roundup

Administration Releases Long-Awaited Study on “Big Data” and Privacy

On May 1, the White House released its long-awaited report on big data and privacy. The report, entitled “Big Data: Seizing Opportunities, Preserving Values,” is the result of a 90-day study directed by President Obama in January. Overall, the report captures the great opportunities presented by data-driven innovation, highlights a wide range of conclusions, and makes concrete recommendations for Administration attention and policy development in a few key areas. As highlighted by the study’s lead, John Podesta, the report represents a starting point for an increased focus by the Obama Administration on policy issues related to big data.

In response to the study, SIIA released a press statement welcoming the report and highlighting the effectiveness of the current legal and regulatory framework in accommodating the privacy and security concerns associated with big data. SIIA also supports the report’s specific proposals for maximizing the educational benefits of data, and views the report as an important contribution to the international discussion.

SIIA is thoroughly reviewing the White House study, as well as a related study issued by the President’s Council of Advisors on Science and Technology (PCAST), which takes a more detailed, “technological perspective” on big data and privacy. We will provide a detailed summary and analysis of the reports for members in the near future.

President Obama and Chancellor Merkel Repeat Positions on Privacy/Surveillance

At a May 2 press conference the President reiterated that he had “taken the unprecedented step of ordering our intelligence communities to take the privacy interests of non-U.S. persons into account in everything they do, something that’s not been done before and most other countries in the world do not do.” Obama also said that the United States was committed to a “cyberdialogue” with Germany. He was firm, however, that there would be no “no spy” agreement between the two countries.

The Chancellor said: “Under the present conditions, we have, after all, possibilities, as regards differences of opinion, to overcome these differences in the medium term and in the long term.” She mentioned the U.S.-Germany cyberdialogue, the U.S.-EU Safe Harbor Framework negotiations, and the EU’s proposed General Data Protection Regulation. Chancellor Merkel also called for more cooperation between parliaments, i.e. the U.S. Congress and the European Parliament, and mentioned “proportionality” as one issue still dividing the United States and Germany. From the German perspective, this means that national security-related privacy exceptions must be “proportional” to the national security risk at hand.


David LeDuc is Senior Director, Public Policy at SIIA. He focuses on e-commerce, privacy, cyber security, cloud computing, open standards, e-government and information policy. Follow the SIIA public policy team on Twitter at @SIIAPubPolicy.

Administration’s Report on Big Data an Important Contribution to the International Discussion

The Software & Information Industry Association welcomed the Administration’s report on “Big Data: Seizing Opportunities, Preserving Values.”  The authors of the report have done a good job in identifying opportunities, while at the same time wrestling with the challenges of preserving what the report correctly suggests are global values, specifically on privacy.  Indeed, the report is commendable in the international context it provides.

The section starting on page 15, on U.S. Privacy Law and International Privacy Frameworks, provides useful context and historical background. It is worth recalling that while privacy regimes around the world have developed in different ways, the Fair Information Practice Principles (FIPPs), first articulated in the United States in the early seventies, have been influential around the world in the development of privacy rules. Given that common heritage, SIIA believes that with imagination, creativity and hard work, the different privacy regimes that exist in the world can function in an interoperable fashion. Indeed, with respect to the interoperability of different privacy frameworks, the report calls for the following:

“The Administration should also work to strengthen the U.S.-European Union Safe Harbor Framework, encourage more countries and companies to join the APEC Cross Border Privacy Rules system, and promote collaboration on data flows between the United States, Europe and Asia through efforts to align Europe’s system of Binding Corporate Rules and the APEC CBPR system.”

SIIA and its members are actively engaged in all of these efforts.  We are especially hopeful that the United States and the European Union can conclude by this summer, as planned, revisions to the U.S.-EU Safe Harbor Framework.

The report also notes:

“Privacy is a worldwide value that the United States respects and which should be reflected in how it handles data regarding all persons. For this reason the United States should extend privacy protections to non-U.S. persons.”

As a step toward this goal, the report recommends that:

“The Office of Management and Budget should work with departments and agencies to apply the Privacy Act of 1974 to non-U.S. persons where practicable, or to establish alternative privacy policies that apply appropriate and meaningful protections to personal information regardless of a person’s nationality.”

We urge the Administration to move forward in accordance with this recommendation. Our view is that, if implemented, it could be a building block for reinforcing trust overseas, which would be helpful to America’s innovative industries, many of which are SIIA members.

There is much work to be done domestically and internationally in order to ensure, as the report title so aptly puts it, that opportunities are seized and values preserved.  We think that this document provides a valuable input, both nationally and internationally, on Big Data’s current and potential contributions, as well as what can and should be done in a practical sense to ensure that privacy values are not only protected but also promoted.


Carl Schonander is Director of International Public Policy at SIIA.
